TRANSCRIPT
COMPUTER-MEDIATED COMMUNICATION IMPACT ON THE ACADEMIC
AND SOCIAL INTEGRATION OF COMMUNITY COLLEGE STUDENTS
David Lynn Dollar, B.S., M.S.
Dissertation Prepared for the Degree of
DOCTOR OF PHILOSOPHY
UNIVERSITY OF NORTH TEXAS
August 2003
APPROVED:
John L. Baier, Major Professor
Jon I. Young, Minor Professor
Ronald W. Newsom, Committee Member and Program Coordinator for Higher Education
Michael Altekruse, Chair of the Department of Higher Education
M. Jean Keller, Dean of the College of Education
C. Neal Tate, Dean of the Robert B. Toulouse School of Graduate Studies
Dollar, David Lynn, Computer-mediated communication impact on the academic
and social integration of community college students. Doctor of Philosophy (Higher
Education), August 2003, 199 pp., 13 tables, 2 illustrations, references, 154 titles.
Although research findings to date have documented that computer-mediated
communication (CMC) gets students involved, a substantial gap remained in determining
the impact of CMC on academic and social integration of community college students.
Because computer technology, specifically CMC, has proliferated within teaching and
learning in higher education and because of the importance of academic and social
integration, this study was significant in documenting through quantitative data analysis
the impact that CMC had on the academic and social integration of community college
students. The following research question was addressed: Does computer-mediated
communication have an impact on the academic and social integration of community
college students as measured by the CCSEQ? The study hypothesized that data analysis
would show no difference in the integrations reported by the control and experimental
groups.
The overall approach was to conduct a pretest-posttest control-group experimental
study using CMC as the experimental treatment. The Community College Student
Experiences Questionnaire (CCSEQ) was given to collect data that were used to measure
the academic and social integration of the control and experimental groups. After an in-
depth analysis of data using descriptive statistics, factor analysis, and ANCOVA, the
finding of this study was that there is no statistically significant difference between the
control and experimental groups on their academic and social integrations as measured by
the CCSEQ. In other words, CMC did not have a positive or negative impact on the
integrations of community college students. This study examined for the first time the
impact that CMC had on the integrations of community college students and provided an
experimental methodology that future researchers might replicate or modify to further
explore this topic. Because CMC will continue to increase as technology becomes more
available and accessible to faculty and students and because of the importance of
academic and social integration, further study on this relationship is vital to higher
education research.
TABLE OF CONTENTS

Page

LIST OF TABLES .................................................................. iv
LIST OF ILLUSTRATIONS ........................................................... v

Chapter

1. INTRODUCTION ................................................................. 1
   Statement of the Problem .................................................... 5
   Purpose of the Study ........................................................ 6
   Research Question and Hypothesis ............................................ 7
   Significance of the Study ................................................... 7
   Definition of Terms ......................................................... 8
   Limitations and Delimitations ............................................... 8
   Organization of the Study ................................................... 10

2. REVIEW OF RELATED LITERATURE ................................................ 11
   Community Colleges .......................................................... 11
   Community Colleges and Computer Technology .................................. 15
   Community Colleges and Computer-Mediated Communication ...................... 24
   Student Involvement and Academic and Social Integration ..................... 31
      Tinto’s Student Integration Model ........................................ 32
      Astin’s Student Involvement Model ........................................ 43
      C. Robert Pace and Quality of Effort ..................................... 46
      Community College Student Experiences Questionnaire (CCSEQ) .............. 48
   Summary ..................................................................... 53

3. METHODOLOGY ................................................................. 54
   Research Design and Sample .................................................. 54
   Data Collection Procedures and Instrument (CCSEQ) ........................... 59
      Background, Work, and Family ............................................. 60
      College Program .......................................................... 61
      College Courses .......................................................... 61
      College Activities ....................................................... 61
      Estimate of Gains ........................................................ 64
      College Environment ...................................................... 64
      Additional Questions ..................................................... 65
      Quality of Effort Scales ................................................. 65
   Procedures for Data Analysis ................................................ 67
   Testing the Research Question ............................................... 68
   Expected Results ............................................................ 69

4. RESULTS ..................................................................... 71
   Research Sample and Descriptive Statistics .................................. 71
   Factor Analyses of Pretests ................................................. 77
      Academic Integration Factor Analyses ..................................... 78
      Social Integration Factor Analyses ....................................... 81
   Analysis of Covariance (ANCOVA) ............................................. 85
      Academic Integration ANCOVA .............................................. 87
      Social Integration ANCOVA ................................................ 89
   Summary of Findings ......................................................... 91

5. CONCLUSIONS ................................................................. 92
   Introduction ................................................................ 92
   General Findings ............................................................ 93
      Methodology .............................................................. 93
      Sample ................................................................... 96
   Interpretation Based on Statistical Procedures .............................. 97
   Interpretation Related to Previous Research ................................. 99
   Limitations ................................................................. 100
   Implications ................................................................ 102
   Conclusions ................................................................. 103

APPENDICES ..................................................................... 105
   A. Community College Student Experiences Questionnaire (CCSEQ) ............. 105
   B. Blackboard Learning System Computer-Mediated Communication ............... 114
   C. Control and Experimental Group Communications ............................ 124
   D. Control and Experimental Group Responses to CCSEQ ........................ 129
   E. Institutional Review Board Approval ...................................... 185

REFERENCE LIST ................................................................. 187
LIST OF TABLES
Table                                                                      Page

Table 3.1.  College Activity Topics: Examples of Items in Each Group ....... 63
Table 3.2.  Quality of Effort Scales ....................................... 66
Table 4.1.  CCSEQ Background, Work, Family, and College Program Items ...... 76
Table 4.2.  Academic Integration Scales .................................... 79
Table 4.3.  Academic Integration Component Matrix .......................... 80
Table 4.4.  Academic Integration Construct Component Matrix ................ 81
Table 4.5.  Social Integration Scales ...................................... 82
Table 4.6.  Social Integration Component Matrix ............................ 83
Table 4.7.  Social Integration Construct Component Matrix .................. 84
Table 4.8.  Academic and Social Integration Descriptive Statistics ......... 85
Table 4.9.  Academic Integration ANCOVA .................................... 88
Table 4.10. Social Integration ANCOVA ...................................... 90
Table 4.11. Academic and Social Integration ANCOVA ......................... 91
LIST OF ILLUSTRATIONS
Figure                                                                     Page

Figure 2.1. Tinto’s 1975 theoretical schema: The 13 primary propositions ... 33
Figure 2.2. Pace’s (1979) student development and college impress model (p. 126) ... 47
CHAPTER 1
INTRODUCTION
Community colleges celebrated their one-hundredth anniversary in America in
2001. Today there are more than 1,600 community colleges that serve in excess of 5
million students. Community colleges enroll almost one-half of all undergraduate
students and more than one-half of all first-time freshmen, not to mention the training
programs designed for employees of local industries (Aslanian, 1997; Baker, 1998).
Community colleges serve a diverse student population with varying ages,
ethnicities, educational goals, educational backgrounds, and socioeconomics. These non-
traditional students take courses for vocational, avocational, certification, or other
utilitarian reasons (Bean and Metzner, 1985). The growth of community colleges has
been phenomenal; however, their success has not been achieved without problems.
Currently, community colleges are struggling to improve their effectiveness by keeping
their students enrolled and increasing their graduation rates. Tinto et al. (1994) claimed
that students are not retained because their colleges are not integrating them academically
or socially. In other words, there is a lack of “fit,” an incongruence between the
individual and the institution.
In order for community college students to be academically and socially
integrated, they must be involved in the educational process. The 1984 Study Group on
the Conditions of Excellence in American Higher Education defined the importance of
student involvement as: “The more time and effort students invest in the learning process
and the more intensely they engage in their own education, the greater will be their
growth and achievement, their satisfaction with their educational experiences, and their
persistence in college, and the more likely they are to continue their learning” (p. 17).
Student involvement has been extensively studied and several models have emerged as
important contributors to educational research. Three of the most important student
involvement models were proposed by Vincent Tinto, Alexander Astin, and C. Robert
Pace.
Tinto’s Student Integration Model (1975) offered a prospective account of
student persistence that considers a comprehensive set of background and psychosocial
factors. Central to the model is the impact that academic and social integration has on
goal and institutional commitment and on the subsequent decision to persist or withdraw
from the institution. Academic integration is broadly defined as behaviors that students
can engage in on an academic level such as meeting with faculty and advisors, using the
library, and attending out-of-class academic activities. Social integration can be defined
as behaviors related to social involvement, such as meeting other students, making
friends through extra-curricular activities, and attending social and cultural events on campus.
Students who do not engage in behaviors that lead to social and academic integration are
less likely to persist in college and more likely to withdraw (Tinto, 1987).
Astin’s Student Involvement Model proposed in 1977 described student
involvement as “the amount of physical and psychological energy that the student
devotes to the collegiate experience. Thus, a highly involved student is one who, for
example, devotes considerable energy to studying, spends much time on campus,
participates actively in student organizations, and interacts frequently with faculty
members and other students” (Astin, 1984, p. 297). The more active and involved
students are, the more academically and socially integrated they become, which leads to
higher satisfaction. Students who are more satisfied persist at higher levels than those
who are not satisfied (Astin, 1977, 1993).
Pace (1979, 1984, 1998) was convinced that the breadth and scope of student
involvement was essential to the quality of undergraduate education. Pace believed that
the quality of effort that students themselves invest in using the facilities and
opportunities for learning and development that exist in the college setting was the most
influential variable in student growth and development. Pace theorized that education is
both a process and a product and that the quality of the process could be measured. Pace
operationalized his model by developing an instrument called the College Student
Experiences Questionnaire (CSEQ) to measure the involvement of college students. The
CSEQ developed by Pace (1979) was modified by Friedlander, Pace, and Lehman (1990)
to focus on student involvement at two-year colleges and was aptly named the
Community College Student Experiences Questionnaire (CCSEQ). The CCSEQ
endorsed the concept that the greater the involvement of students, the greater the
progress they report making while enrolled at that institution.
A pervasive theme in the literature is the challenge for community colleges to
embrace the future and adapt curriculum and delivery methods so as to best meet the
needs of their students and communities (Baker, 1998, 1999; Bryant 1998; Lazarick,
1998; Raisman, 1999; Travis & Travis, 1999). One of the most radical changes and
challenges that has taken place in higher education is the shift to using computer
technology. Pascarella and Terenzini (1998) reported that new technologies offer
“opportunities to expand access to higher education, to respond to diverse student
learning styles, to provide vehicles for active student involvement, and to reduce costs”
(p. 159). Computer technology can help community college educators tailor instruction
to the diversity of learning styles, skill levels, cultural differences, motivations, and
educational objectives of their increasingly pluralistic student bodies (Doucette, 1994).
Friedlander (1993) summed up the technology challenge for community colleges in this
way: “Creative approaches need to be developed that utilize the capabilities of
educational technology in the learning process while maintaining the critically important
human interaction between students and faculty” (pp. 5-6).
Bower (1998) identified several organizational and political realities that make
computer technology in instruction a viable option for community colleges.
First, community colleges must continue to use computer technology to prepare students
for positions in the workplace. Second, community colleges place strong emphasis on
teaching. Third, computer technology can be instrumental in helping community college
instructors meet the individual needs of diverse community college students with varied
academic preparedness. Fourth, community colleges place a high value on student access.
Well-designed instruction using computer technology can provide quality learning
experiences to more community college students in places and at times conducive to
individual learning styles and schedules (Bower, 1998).
Computer technology that supports human communication has been given the
name Computer-Mediated Communication (CMC). CMC refers to computer applications
that facilitate human-to-human communication and includes electronic mail,
asynchronous group conferencing systems (bulletin boards and listservs), and
synchronous interactive chat systems (Berge & Collins, 1995). Santoro (1995) defined
CMC as the use of computer systems and networks for the transfer, storage, and retrieval
of information among humans as a tool for instructional support.
Berge and Collins (1995) found that CMC is changing instructional methods by
generating improved technological tools that allow classes to use a fuller range of
interactive methodologies and encouraging teachers and administrators to pay more
attention to the instructional design of courses. Both of these factors can improve the
quantity, quality, and patterns of communication that students practice during learning.
As it is currently being used for instructional support, CMC provides electronic mail,
bulletin boards, listservs and real-time chat capabilities, delivers instruction, and
facilitates student-to-student and student-to-teacher interactions. These uses are
promoting several paradigmatic shifts in teaching and learning, including the shift from
instructor-centered education to student-centered learning and the merging of informal
dialogues, invisible colleges, oral presentations, and scholarly publications into a kind of
dialogic virtual university (Berge & Collins, 1995).
Statement of the Problem
Most of the studies on academic and social integration and the behaviors students
can engage in to achieve such integration were developed before the infusion of
technology into higher education. Only two studies were located during the extensive
literature review process that related computer technology to academic and social
integration. Gatz and Hirt (2000) conducted a study at a large, public, research university
to gain a better understanding of whether email was replacing traditional behaviors in
which college students engage to achieve academic and social integration. The results
indicated that while the participants did use email for some academic and social
integration purposes, the bulk of their email activity did not relate to either form of
integration. Ashmore (2000) examined computer engagement of students at a two-year
school to determine if computer engagement had an impact upon perceived growth and
development, and if this engagement had an effect on academic and social involvement.
Information was collected by administering the CCSEQ to 800 students at a two-year
college in West Tennessee. The results showed that computer usage was marginally
significant (explaining less than 1% of the variance) in affecting the Career Development,
Communication, and Math/Science/Technology outcome variables. However, the
computer-usage variable did not alter the effects of academic and social involvement.
Holden and Mitchell (1993) concluded that the role of CMC in colleges will
likely continue to alter parts of the instructional process. By allowing students and
faculty to communicate when and where it is convenient, CMC can make the teaching
and learning process more flexible and instruction more effective. CMC research has
documented that student involvement is one of the many positive instructional benefits of
using CMC. However, an extensive review of the literature revealed no research studies
relating CMC to the academic and social integration of college students.
Purpose of the Study
Therefore, the purpose of this study was to determine the impact that CMC has on
the academic and social integration of community college students. The overall approach
was to conduct an experimental study using computer-mediated communication as the
experimental treatment. The Community College Student Experiences Questionnaire
(CCSEQ) was given as a pretest and posttest to measure the academic and social
integration of the control and experimental groups. The data collected for the control
and experimental groups were analyzed to see if the computer-mediated communication
treatment had an impact on the academic and social integration of community college
students.
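The analytic logic just described, comparing posttest integration scores between a control and an experimental group while adjusting for pretest scores, can be sketched as a small pretest-posttest ANCOVA. The following Python fragment is only an illustration of that logic, not the study's actual analysis code: the scores, group labels, and function names are hypothetical, and in practice a statistical package (e.g., SPSS or statsmodels) would be used rather than hand-rolled least squares.

```python
# Illustrative ANCOVA sketch: test for a group effect on posttest scores
# while adjusting for the pretest as a covariate. All data are made up.
import math

def lstsq(X, y):
    """Solve the normal equations (X'X)b = X'y by Gaussian elimination."""
    n, k = len(X), len(X[0])
    A = [[sum(X[i][r] * X[i][c] for i in range(n)) for c in range(k)]
         for r in range(k)]
    b = [sum(X[i][r] * y[i] for i in range(n)) for r in range(k)]
    for col in range(k):                      # forward elimination with pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):            # back substitution
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

def sse(X, y):
    """Residual sum of squares for an ordinary least-squares fit."""
    beta = lstsq(X, y)
    return sum((yi - sum(bj * xj for bj, xj in zip(beta, row))) ** 2
               for row, yi in zip(X, y))

def ancova_f(pre, post, group):
    """F statistic for the group effect, adjusting for the pretest covariate."""
    n = len(post)
    reduced = [[1.0, p] for p in pre]                  # post ~ pretest
    full = [[1.0, p, g] for p, g in zip(pre, group)]   # post ~ pretest + group
    sse_r, sse_f = sse(reduced, post), sse(full, post)
    return ((sse_r - sse_f) / 1.0) / (sse_f / (n - 3))

# Hypothetical pretest/posttest integration scores; group: 0 = control, 1 = experimental
pre   = [2.1, 2.5, 3.0, 1.8, 2.9, 3.2, 2.4, 2.7]
post  = [2.3, 2.6, 3.1, 2.0, 3.0, 3.3, 2.5, 2.9]
group = [0, 0, 0, 0, 1, 1, 1, 1]
print(round(ancova_f(pre, post, group), 3))
```

The F statistic compares model fit with and without the group term; an F that is small relative to the F distribution with 1 and n - 3 degrees of freedom would be consistent with the null hypothesis of no difference between the groups.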
Research Question and Hypothesis
The following research question was addressed in this study:
Does computer-mediated communication have an impact on the academic and
social integration of community college students as measured by the CCSEQ?
The study hypothesized that data analysis would show no difference in the
academic integration and social integration reported by the control and experimental
groups. This study attempted to answer this question through the analysis
of data gathered utilizing the Community College Student Experiences Questionnaire
(Friedlander, Pace, & Lehman, 1990).
Significance of the Study
Although research findings to date have documented that computer-mediated
communication gets students involved, a substantial gap remained in determining the
impact of computer-mediated communication on academic and social integration of
community college students. Because computer technology, specifically computer-
mediated communication, has proliferated within teaching and learning in higher
education and because of the importance of academic and social integration, this study
was significant in documenting through quantitative data analysis the impact that
computer-mediated communication had on the academic and social integration of
community college students.
Definition of Terms
For the purposes of this study, the following terms are defined:
Computer-mediated communication - computer applications for direct human-to-
human communication including electronic mail, asynchronous group conferencing
systems (bulletin boards and listservs), and synchronous interactive chat systems for the
transfer, storage, and retrieval of information among humans as a tool for instructional
support (Berge & Collins, 1995; Santoro, 1995).
Academic integration - behaviors that students can engage in on an academic
level such as meeting with faculty and advisors, using the library, and attending out-of-
class academic activities (Tinto, 1987).
Social integration - behaviors related to social involvement, such as meeting
other students, making friends through extra-curricular activities, and attending social and
cultural events on campus (Tinto, 1987).
Limitations and Delimitations
The relevant limitations of this study included the following:
1. Because the researcher was unaware of any other instructors who augment
their classes with computer-mediated communication at the community
college used for this study, the researcher was also the instructor of the
research participants.
2. Because the research participants from the same class were randomly assigned
to the control and experimental groups, there could have been some diffusion
of treatment between the two groups.
The relevant delimitations of this study included the following:
1. One area of concern in any quantitative study is the validity and
reliability of the instrument specified for gathering data. The
instrument used in this study was the Community College Student
Experiences Questionnaire (Friedlander, Pace, & Lehman, 1990). The
validity and reliability of this instrument, which has been adequately
tested since 1991, is discussed in Chapter 3.
2. The timeframe for data collection was one full semester (16 weeks).
This timeframe may have been too short to adequately measure
academic and social integration.
3. By design, the study was limited to public two-year college students
enrolled in the researcher’s Chemistry 1406 course. Thus, the study
did not incorporate other non-traditional students or independent two-
year college students.
4. In terms of the quantitative research design, the study relied on a
questionnaire (the CCSEQ) designed specifically for the community
college setting; as a self-report instrument, it is subject to the
perception bias of the respondents.
Despite these limitations and delimitations, this study examined for the first time
the impact that computer-mediated communication had on the academic and social
integration of community college students with supporting documentation through
quantitative data analysis. This study also provided an experimental methodology that
future researchers might replicate or modify to further explore this topic.
Organization of the Study
This study was organized into five chapters. Chapter 1, Introduction, presents the
purpose of the study, statement of the problem, research question, significance of the
study, definition of terms, and delimitations. Chapter 2, Review of Related Literature,
reviews the current research on the topics specifically related to this research that
includes the following sections: community colleges; community colleges and
technology; community colleges and computer-mediated communication; and student
involvement theories proposed by Tinto, Astin, and Pace dealing with academic and
social integration and the Community College Student Experiences Questionnaire
(CCSEQ). Chapter 3, Methodology, discusses the data collection procedures and
instrument (CCSEQ), research design and sample, procedures for data analysis, testing
the research question, and the expected results. Chapter 4, Results, summarizes the
quantitative data analysis. And Chapter 5, Conclusions, discusses the findings and their
implications for future practice and research.
CHAPTER 2
REVIEW OF RELATED LITERATURE
Because computer technology, specifically computer-mediated communication,
has proliferated within teaching and learning in higher education and because of the
importance of academic and social integration, this study was significant in documenting
through quantitative data analysis the impact that computer-mediated communication has
on the academic and social integration of community college students. Although
research findings to date have documented that computer-mediated communication gets
students involved, a substantial gap remained in determining the impact of computer-
mediated communication on academic and social integration. In order to develop and
study the research question posed in this study, an extensive review of the literature was
pertinent. This review of the literature related to the impact of computer-mediated
communication on the academic and social integration of community college students is
divided into the following topics: (1) Community Colleges, (2) Community Colleges and
Computer Technology, (3) Community Colleges and Computer-Mediated
Communication, and (4) Student Involvement and Academic and Social Integration
including Tinto’s Student Integration Model, Astin’s Student Involvement Model, Pace
and Quality of Effort, and the Community College Student Experiences Questionnaire
(CCSEQ).
Community Colleges
Community colleges celebrated their one-hundredth anniversary in America in
2001. In 1901 there were eight private two-year schools in America, then known as
“junior colleges.” The term “junior college” was most often used for independent,
private, or church-affiliated schools, while “community college” was primarily used by
publicly supported institutions. By the 1970s, the term “community college” was most
widely known and utilized for all two-year schools and was further defined as any
institution accredited to award the Associate in Science or the Associate in Arts as its
highest degree (Cohen & Brawer, 1991).
The community college was originally intended to provide high school graduates
with the first two years of their liberal arts education thereby fulfilling their general
education requirements before transferring to four-year institutions (Baker, 1998, 1999;
Cohen & Brawer, 1991). Skill training and a general education curriculum were also offered.
The decline of blue collar jobs and the increase of opportunities in the business,
professional and technical fields created a need for specialized training (Bean & Metzner,
1985). Community colleges responded to these training needs by creating partnerships
with local industry to supply labor market training needs (Baker, 1998; Bryant, 1998;
Cohen & Brawer, 1991; Lazarick, 1998; Phelps, 1994).
According to Baker (1998), today there are more than 1,600 community college
campuses that serve in excess of 5 million students. A large increase in community
college enrollment occurred during the 1960s and 1970s because of G.I. bill educational
benefits, high birthrate in the 1940s, increased state funding, and the accessibility of
community college campuses (Cohen & Brawer, 1991). Moreover, the Russian launch of
Sputnik gave impetus to the National Defense Education Act of 1958 and the Higher Education
Act of 1965. Both of these acts legislated that funding to support higher education should
be available from both the state and federal levels. This additional funding promoted
college attendance (Bean & Metzner, 1985). As a result, community colleges today
enroll almost one-half of all undergraduate students and more than one-half of all first-
time freshmen, not to mention the training programs specifically designed for employees
of local industries (Aslanian, 1997). Furthermore, community college enrollments are
projected to increase 11 percent by the year 2003 (Pascarella & Terenzini, 1998; United
States Department of Education, 1997).
Cohen and Brawer (1991) used the words “number and variety” to describe the
diverse student population being served by community colleges. This diversity includes
varying ages, ethnicities, educational goals, educational backgrounds, and
socioeconomics (Aslanian, 1997; Bean & Metzner, 1985). A 1997 survey by the North
Carolina State University’s Student Assessment of the College Environment showed that:
a) the age of community college students ranges from younger than 20 to over 70, b)
there is a higher percentage of females (57%) than males (43%), c) students attend either
full-time (52%) or part-time (48%), d) students have various attendance patterns,
motivations, long-term goals, and family responsibilities, and e) students have uneven basic
education skills, including reading, writing, computing, and thinking (Baker, 1998).
Students attend community colleges for a variety of reasons. Community colleges
are responsible for providing adult education as well as educational, recreational and
vocational activities (Cohen & Brawer, 1991). Baker (1999) ascertained from the
Student Assessment of the College Environment survey that sixty-two percent of
community college students, regardless of age, stated their main goal for attending
college was an immediate career objective. Thirty-two percent of the respondents
indicated the desire to transfer to a four-year institution and continue their education.
Some community college students have no degree aspirations but are taking courses
they feel will enhance their career objectives. Moreover, Bean and Metzner (1985)
depicted non-traditional students as frequently taking courses for “vocational,
avocational, certification, or other utilitarian reasons” (p. 489).
The growth of community colleges has been phenomenal. However, their success
has not been achieved without problems. Currently, community colleges are struggling
to improve their effectiveness by keeping their students enrolled. Pascarella and Terenzini
(1991) found that “when assessed over the same period of time, baccalaureate aspirants
who enter two-year colleges tend to have lower levels of educational and degree
attainment than do comparable individuals who enter four-year institutions” (p. 373).
Community colleges also contend with the issue of graduation rates. According
to Napoli and Wortman (1996), “graduation rates [in community colleges] are
substantially lower. Less than 39% of students complete their associate degree within 3
years of initial entry.” Nationally, “first year departure rates are 28% and 48% for 4-year
and 2-year public colleges, respectively” (p. 6); and typically, “only one-third of all first-
time full-time [community college] students earn associate degrees or certificates” (Tinto,
Russo, & Kadel, 1994, p. 26). Tinto et al. (1994) claimed that students are not retained
because their colleges are not integrating them academically or socially. There is a lack
of institutional-individual “fit,” that is, an incongruence between the individual and the
institution.
Community colleges must look toward the future and be prepared to make
changes, just as the corporate sector must constantly reinvent itself to remain competitive.
Baker (1999) stated that the community college system “cannot deliver on
promises unless and until we restructure ourselves” (p. 35). Travis and Travis (1999)
described the varied issues confronting community colleges as “consistently fluid” (p.
20). The pervasive theme in the literature is the challenge for community colleges to
embrace the future, and adapt curriculum and delivery methods so as to best meet the
needs of their students and communities (Baker, 1998, 1999; Bryant 1998; Lazarick,
1998; Raisman, 1999; Travis & Travis, 1999).
According to Cohen and Brawer (1991), today’s community colleges emphasize
curricula designed to meet the needs of the student population, community, and labor
market. “The [community] college may, and is likely to, develop a different type of
curriculum suited to the larger and ever-changing civic, social, religious, and vocational
needs of the entire community in which the college is located” (Cohen & Brawer, 1991,
pp. 3-4). According to Smith and Baxter (1994), “Higher education needs to begin
addressing these hard questions in a serious manner…(and) to enhance our ability to
serve a world undergoing dramatic change—economically, demographically, socially,
globally, and technologically” (pp. 37-38). According to Baker (1999), the year 2000
will “become a watershed for finally delivering on access, academic, and diversity issues
talked about in the twentieth century” (p. 33). One of the most radical changes that has
taken place in higher education is the shift to technology in the classroom.
Community Colleges and Computer Technology
The tightening bond between the computer and higher education began just after
World War II. Research universities were the early adopters of computer technology and
have continued to invest in this technology. The introduction of small-scale computers
increased the infusion of computers into education. At research universities, computer use
gradually spread from research activities to administrative use to student access.
However, at community colleges a number of factors accelerated the movement of
computer use from administration directly to student-centered instructional applications.
Accelerating student computer access was part of the community college mission to
provide quality technical and career education: students needed access to computers to
acquire the skills necessary to be competitive in business and industry (Bower, 1998).
At first computers were the subject of instruction in computer hardware repair and
computer programming classes. As more user-friendly applications appeared such as
spreadsheets and word processors, faculty discovered that these business-oriented tools
could also be used in teaching and learning. As computers and their applications became
integrated into the ways people and businesses work, they also became a part of
community college instructional practices (Bower, 1998; Doucette, 1994).
In 1965, less than 5% of American college students had access to computing
services that adequately met their needs (Heterick, 1993). In 1996, the American
Association of Community Colleges (AACC) conducted a computer-use survey that
showed that 96% of community colleges had on-campus computer access for students
with a community college student-computer ratio of 23 to 1 (Kienzl & Li, 1997).
Another 1996 survey by the Campus Computing Project found that 31% of community
college courses used computer-based classrooms or labs (Green, 1996).
Hopkins (1998) reported that 69% of colleges and universities have Internet and
World Wide Web access available to their students. Many community college students
have access to computer technology through the availability of computer labs on campus.
Doucette (1994) stated that there are two types of technology on campuses: the type that
enables students and faculty to become better at what they already know, and the type
that changes the way that faculty teach and their students learn. The new teaching and
learning technologies, including the World Wide Web, file-transfer protocols, listservs,
bulletin boards, gophers, interactive TV, cable and satellite transmission, and computer
conferencing, have made the delivery of coursework available to students anywhere and at
any time. Instructors can place their lessons on a website, and students can log on to
the site at their own convenience from any location. Recommendations for facilitating
student-faculty interaction through technology include the following methods: providing
e-mail access so that students can interact with faculty outside the classroom, creating on-
line appointments for faculty and students to meet and assess projects, creating chatrooms
for students to discuss topics, and encouraging the use of technology for creative
engagement (Acebo et al., 1998; Bigelow, 1993; Doucette, 1994; Langhorst, 1997; Paine,
1996; Pascarella & Terenzini, 1998; Privateer, 1999).
This shift in coursework delivery methods has created new questions for
community college educators. Pascarella and Terenzini (1998) stated that one of these
questions concerns the “reconsideration and redefinition of conventional understandings
of faculty and student roles and the responsibilities in the teaching and learning process,
as well as shifts in the ways students and faculty interact both in and out of the
classroom” (p. 159). Pascarella and Terenzini (1998) reported that new technologies
offer “opportunities to expand access to higher education, to respond to diverse student
learning styles, to provide vehicles for active student involvement, and to reduce costs”
(p. 159). There is an overarching body of literature that supports this statement (e.g.,
Acebo, Burrus, & Kanter, 1998; Bigelow, 1993; Bryant, 1994; Doucette, 1994;
Friedlander, 1993; Langhorst, 1997; Paine, 1996; Privateer, 1999). Doucette (1994)
emphasized that colleges have no choice but to keep abreast of changing technologies
because “the world from which our students come and the world of work for which we
prepare them have been thoroughly infused with technology” (p. 20). Pascarella and
Terenzini (1998) stated that “current, emerging, and as-yet-undreamed-of information
technologies are forcing serious reconsideration of our assumption of how, when and
where instruction (and education more broadly) can be delivered and learning promoted”
(pp. 162-163).
The community college’s diverse learner population includes academically
underprepared students, traditional-age students, returning adult students, socially and
economically disadvantaged students, physically handicapped students, academically
talented students, learning disabled students, and international students. For this reason,
community college educators are drawn to the capabilities and possibilities of computer-
based instructional technology (Bower, 1998). This technology can help community
college educators tailor instruction to “the diversity of learning styles, cultural
differences, skill levels, motivations, and educational objectives of an increasingly
pluralistic student body” (Doucette, 1994, p. 24). Land & Haney (1989) emphasized that
as community colleges continue to fulfill their mission of serving the broad range of
interests and abilities exhibited in their communities and student populations, they must
strive to improve teaching and learning strategies.
Since the Seven Principles of Good Practice in Undergraduate Education were
developed (Chickering & Gamson, 1987), new communication and information
technologies have become major resources for teaching and learning in higher education.
If the power of these new technologies is to be fully realized, they should be employed in
ways consistent with these Seven Principles. Chickering and Ehrmann (1996) described
some of the most effective and appropriate ways to use technology to advance the Seven
Principles.
1. Good Practice Encourages Contacts Between Students and Faculty. Electronic mail, computer conferencing, and the World Wide Web increase opportunities for students and faculty to converse and exchange work much more speedily than before, and more thoughtfully and “safely” than when confronting each other in a classroom or faculty office. Total communication increases and, for many students, the result seems more intimate, protected, and convenient than the more intimidating demands of face-to-face communication with faculty. With the new media, participation and contribution from diverse students become more equitable and widespread.
2. Good Practice Develops Reciprocity and Cooperation Among Students. The extent to which computer-based tools encourage spontaneous student collaboration was one of the earliest surprises about computers. A clear advantage of email for today’s busy commuting students is that it opens up communication among classmates even when they are not physically together.
3. Good Practice Uses Active Learning Techniques. The range of technologies that encourage active learning is staggering. Many fall into one of three categories: tools and resources for learning by doing, time-delayed exchange, and real-time conversation.
4. Good Practice Gives Prompt Feedback. The ways in which new technologies can provide feedback are many—sometimes obvious, sometimes more subtle. We already have talked about the use of email for supporting person-to-person feedback, for example, and the feedback inherent in simulations. Computers also have a growing role in recording and analyzing personal and professional performance.
5. Good Practice Emphasizes Time on Task. New technologies can dramatically improve time on task for students and faculty members. Technology also can increase time on task by making studying more efficient. Teaching strategies that help students learn at home or work can save hours otherwise spent commuting to and from campus, finding parking places, and so on. Time efficiency also increases when interactions between teacher and students, and among students, fit busy work and home schedules.
6. Good Practice Communicates High Expectations. New technologies can communicate high expectations explicitly and efficiently. Significant real-life problems, conflicting perspectives, or paradoxical data sets can set powerful learning challenges that drive students to not only acquire information but sharpen their cognitive skills of analysis, synthesis, application, and evaluation.
7. Good Practice Respects Diverse Talents and Ways of Learning. Technological resources can ask for different methods of learning through powerful visuals and well-organized print; through direct, vicarious, and virtual experiences; and through tasks requiring analysis, synthesis, and evaluation, with applications to real-life situations (pp. 3-6).
The Seven Principles cannot be implemented by technophiles and faculty alone.
Students need to become familiar with these Principles and be more assertive with
respect to their own learning. Faculty members who already work with students in ways
consistent with the Principles need to be tough-minded about the technology-assisted
interactions they create and buy into. And institutional policies concerning learning
resources and technology support need to give high priority to user-friendly hardware,
software, and communication vehicles that help faculty and students use technologies
efficiently and effectively (Chickering & Ehrmann, 1996).
According to Kuh and Hu (2001), most of what is known about the effects on
achievement of using computing and information technology is based on student
performance in individual courses. Two recent studies examined the impact of
computing on outcomes other than achievement and content-specific knowledge across
multiple institutions. Kuh and Vesper (2001) reported that after controlling for such
factors as college grades, age, gender, hours worked per week, parents’ education, and
educational aspirations, students’ self-reported gains in becoming familiar with
computers were highly correlated with self-assessed gains in a variety of other areas such
as independent learning, writing clearly, and problem solving. Flowers, Pascarella, and
Pierson (2000) also controlled for many of the same potentially confounding influences
but also included precollege cognitive development and motivation. They found that
computer and e-mail use had only trivial and nonsignificant effects on end-of-first-year
composite cognitive development, reading comprehension, mathematics, and critical
thinking. At the same time, the use of computers and e-mail significantly affected the
cognitive growth of students attending two-year colleges. It was not clear why information
technology had a greater effect on two-year college students than on their counterparts at
four-year institutions.
Although research findings to date are generally promising, a substantial gap
remains in understanding the effects of computer and information technology on student
learning and other educational outcomes (Morrison, 1999). Kuh and Hu (2001)
examined the relationships between student characteristics, student use of computers and
other information technologies (C&IT), the amount of effort they devote to other college
activities, and self-reported gains in a range of desirable college outcomes. Based on an
analysis of responses to the College Student Experiences Questionnaire from 18,344
undergraduates at 71 four-year colleges and universities, they found that students appeared to benefit
more from C&IT when they used it frequently and in a variety of ways. Equally
important, using C&IT was positively related to educational effort with the effects of
C&IT on outcomes of college being largely mediated through the educational efforts
students put forth. Thus, C&IT use appears to have a general beneficial influence on the
overall learning environment.
Gatz and Hirt (2000) conducted a study at a large, public, research university to
gain a better understanding of whether e-mail was replacing traditional behaviors in
which college students engage to achieve academic and social integration. Data
consisted of printouts of email records with corresponding logsheets detailing the
relationship of the participant to the sender/receiver of each message and the general
nature of the message. Additional data included answers to email survey questions and
lists of traditional academic and social integration behavior against which the e-mail
behavior categories were compared. The results indicated that while the participants did
use email for some academic and social integration purposes, the bulk of their email
activity did not relate to either form of integration. Participants, regardless of gender,
seemed to be using email to communicate extensively with family members and high
school friends. Finally, the participants spent a considerable amount of time checking,
composing, and sending email messages. These trends suggest that email has
become an integral part of college student life and that college administrators need to
explore new and effective ways to ensure that the use of email is beneficial to the overall
development of college students.
Ashmore (2000) examined computer engagement of students at a two-year school
to determine what, if any, impact this computer engagement had upon perceived growth
and development, and if this engagement had an effect upon academic and social
involvement. Information was collected by administering the CCSEQ to 800 students at
a two-year college in West Tennessee. Two regressions were performed on five outcome
variables: Career Development, Communications, Math/Science/Technology,
Personal/Social Development, and Perspectives of the World. The first regression was
performed without the variable for Computers. Then, the computer-usage variable was
added into the second regression equation to test for significance and effects upon other
types of involvement. The regression equations including the computer-usage variable
showed that computer usage was only marginally significant (explaining less than 1% of
the variance) in the outcome variables Career Development, Communications, and
Math/Science/Technology. Finally, the computer-usage variable did not alter the effects of
academic and social involvement.
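Ashmore’s two-step procedure amounts to a hierarchical regression: fit an outcome on the base involvement predictors, then refit with the computer-usage variable added and compare the R² values; the change in R² is the incremental variance the added variable explains. The sketch below illustrates that logic only; the data are hypothetical and the single stand-in involvement predictor does not represent Ashmore’s actual variables or dataset.

```python
# Hypothetical sketch of a two-step (hierarchical) regression: compare R-squared
# before and after adding a computer-usage predictor. Not Ashmore's (2000) data.

def ols_r2(X, y):
    """R-squared of an ordinary least squares fit; X rows include an intercept column."""
    n, k = len(X), len(X[0])
    # Build the normal equations (X'X) beta = X'y.
    A = [[sum(X[i][p] * X[i][q] for i in range(n)) for q in range(k)] for p in range(k)]
    b = [sum(X[i][p] * y[i] for i in range(n)) for p in range(k)]
    # Solve by Gaussian elimination with partial pivoting.
    for c in range(k):
        piv = max(range(c, k), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        b[c], b[piv] = b[piv], b[c]
        for r in range(c + 1, k):
            f = A[r][c] / A[c][c]
            for cc in range(c, k):
                A[r][cc] -= f * A[c][cc]
            b[r] -= f * b[c]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    # R-squared = 1 - residual sum of squares / total sum of squares.
    yhat = [sum(X[i][p] * beta[p] for p in range(k)) for i in range(n)]
    ybar = sum(y) / n
    ss_res = sum((y[i] - yhat[i]) ** 2 for i in range(n))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

# Hypothetical student records: one involvement score, one computer-usage score,
# and one outcome score per student (illustrative values only).
involvement = [2.0, 3.5, 1.0, 4.0, 2.5, 3.0, 1.5, 4.5]
computer_use = [1.0, 2.0, 0.5, 3.0, 1.0, 2.5, 0.5, 3.5]
outcome = [2.1, 3.6, 1.2, 4.3, 2.4, 3.3, 1.4, 4.8]

# Step 1: base model without the computer-usage variable.
X_base = [[1.0, inv] for inv in involvement]
# Step 2: full model with the computer-usage variable added.
X_full = [[1.0, inv, cu] for inv, cu in zip(involvement, computer_use)]

r2_base = ols_r2(X_base, outcome)
r2_full = ols_r2(X_full, outcome)
delta_r2 = r2_full - r2_base
print(f"R2 without computer use: {r2_base:.3f}")
print(f"R2 with computer use:    {r2_full:.3f}")
print(f"Incremental variance explained: {delta_r2:.3%}")
```

Because adding a predictor can never reduce R² in ordinary least squares, the substantive question is whether the increment is large enough to matter; Ashmore judged an increment under 1% of the variance to be marginal.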
Instructional computer technology can assist students in building competencies
and minimizing academic phobias. It can assist teachers in providing exciting and
relevant learning experiences and revitalize their enthusiasm for teaching. It can even
empower students to “discover new capabilities within themselves” (Simone, 1992, p. 5).
To paraphrase Luna and McKenzie (1997): Is computer based instruction popular with
students and educators? Yes. Does it improve student performance? Maybe. Is it worth
the cost? Probably. Must we continue to explore this innovative pathway to education?
Definitely. Bower (1998) concluded: “There is evidence of a positive effect of computers
in the curriculum, but more research is needed to tease out the interaction of learner,
curriculum, and technology” (p. 65).
According to Privateer (1999), higher education is at a “strategic academic
technology crossroad” (p. 69). University administrators need to explore how academic
technology policies will impact higher education into the next millennium. Moreover,
students, as well as institutions, should be prepared for the opportunity of “enhancing
their intelligence by using information technologies…in ways that as of yet have not been
articulated” (p. 77). Friedlander (1993) summed up the technology challenge for
community colleges in this way: “Creative approaches need to be developed that utilize
the capabilities of educational technology in the learning process while maintaining the
critically important human interaction between students and faculty” (pp. 5-6).
Community Colleges and Computer-Mediated Communication
Bower (1998) suggested a number of organizational and political realities for
using computer technology in instruction as a viable alternative for community colleges.
First, community colleges must continue to use computer technology to prepare students
for positions in the workplace. Second, community colleges place strong emphasis on
teaching. Third, computer technology can be instrumental in helping community college
instructors meet the individual needs of diverse community college students with varied
academic preparedness. And a fourth factor is the high value placed on student access.
Well-designed instruction using computer technology can provide quality learning
experiences to more community college students in places and at times conducive to
individual learning styles and schedules (Bower, 1998).
Computer-Mediated Communication (CMC) is the name given to a large set of
functions that use computers to support human communication. CMC refers to computer
applications for direct human-to-human communication and includes electronic mail,
group conferencing systems, and interactive ‘chat’ systems. Santoro (1995) defined
CMC as the use of computer systems and networks for the transfer, storage, and retrieval
of information among humans as a tool for instructional support. As it is currently being
used for instructional support, CMC provides electronic mail and real-time chat
capabilities, delivers instruction, and facilitates student-to-student and student-to-teacher
interactions. These uses are promoting several paradigmatic shifts in teaching and
learning, including the shift from instructor-centered education to student-centered
learning and the merging of informal dialogues, invisible colleges, oral presentations, and
scholarly publications into a kind of dialogic virtual university (Berge & Collins, 1995).
During this information age, no one can deny that power comes to those who have
information and know how to access it. Developing self-motivated learners and helping
people learn to find and share information are the most important goals when considering
the educational factors of CMC in the information age. When designed well, CMC
applications can be used effectively to facilitate collaboration among students as peers,
teachers as learners and facilitators, and guests or experts from outside the classroom
(Berge & Collins, 1995).
CMC has begun to receive attention in recent research literature. Research
studies in CMC have focused on the quality of group decision making, interaction
analysis, member participation, the effects of time and space on group interaction, the
quality and dynamics of the interaction, and CMC’s use as a teaching tool. The results are
mixed on the effects of the medium on students as individuals and in groups. However, the tools
are in place to provide students with the opportunity to participate in CMC which will
prepare them to accept and adapt to this medium in the classroom and workplace (Everett
& Ahern, 1994). Berge and Collins (1995) found that CMC is changing instructional
methods by generating improved technological tools that allow classes to use a fuller
range of interactive methodologies and encouraging teachers and administrators to pay
more attention to the instructional design of courses. Both of these factors can improve
the quantity, quality, and patterns of communication that students practice during
learning.
One of the greatest benefits of CMC is its ability to liberate instruction from the
constraints of time and distance. The convenience of access from home, school, or office
permits students and instructors to better meet travel, job, and family responsibilities.
CMC promotes self-discipline and requires students to take more responsibility for their
own learning. CMC motivates and encourages students to become involved in projects
and to write for a real audience of their peers or people in the larger world community,
instead of merely composing assignments for the instructor. Other potential benefits of
CMC include promoting multicultural awareness, reducing the sense of isolation
sometimes felt by students and teachers, and meeting numerous learning and personal
needs of students with CMC’s flexibility and variety (Berge & Collins, 1995).
In her review of 20 years’ worth of CMC literature, Harasim (1990) suggested
several key differences that distinguish computer-mediated conversations from
face-to-face discussions: (1) place dependence, (2) time dependence, (3) structure of
communication, and (4) richness of communication. The first and most profound
difference between face-to-face and computer-mediated communication is that face-to-
face communication is place-dependent. Computer-mediated communication is place-
independent. Computer-mediated communication takes place in “cyberspace,” the world
of intersecting computer networks in which individuals access files, read mail, and talk to
one another. Second, face-to-face discussions must occur not only in the same location
but also at the same time; in electronic discussions conversation is normally
asynchronous. Althaus (1997) added that computer-mediated communications that do
not occur in real time allow students to log on and join the discussion when it is
convenient for them; send messages simultaneously without crowding one another or
disrupting the discussion flow; have more time to read messages posted by others, reflect
on them, and compose thoughtful responses; and participate at their own pace. McComb
(1994) stated that “CMC extends the learning dialogue beyond the classroom. Instructors
are more available to their students; yet asynchronicity enables this extra-classroom
communication to occur at convenient times and places for all concerned. CMC also lets
instructors witness the learning that their students engage in outside of class and is a
gateway to outside resources” (p. 165).
Third, Harasim (1990) stated that most classroom interactions involve one of
three kinds of communication structures: one to one, one to many, or many to many.
Typically, classroom discussions follow the one-to-many model. Instructor-centered
discussions may appear to be collaborative, but often the instructor style keeps students
from interacting. In contrast, on-line discussions are naturally interactive and
collaborative, in part because the medium lends itself to many-to-many communication.
Whereas classroom discussions tend to be instructor-dominated, instructors contribute a
much smaller proportion of the messages in computer-mediated exchanges. And fourth,
compared to face-to-face interactions, computer-mediated communication conveys a
stream of textual information. Text-based communicators often become more reflective
than verbal communicators, more attentive
to the messages of others, and are put on more equal social footing with one another
(Harasim, 1990).
Althaus (1997) examined whether supplementing face-to-face communication
with computer-mediated communication enhanced the academic performance of
undergraduate students in large lecture courses. Student evaluations and academic
performance data from this quasi-experimental study suggested that a combination of
face-to-face and computer-mediated discussion provides a learning environment superior
to that of the traditional classroom. Students actively involved in CMC groups not only
reported learning more than they otherwise would have, but they also tended to earn
higher grades than students taking part in face-to-face discussions only. Research by
Chapman (1998) concluded that when given the time and opportunity to use CMC to
support their learning in campus-based college courses, most students perceived it as a
very beneficial tool. However, access and time commitment were identified as potential
factors that could undermine student use of CMC.
Another benefit of CMC is that it balances power by equalizing control among
participants and gives students practice in exerting control, and thus responsibility. CMC
requires responsible behavior by students. Students take responsibility for their own
learning in the class instead of just sitting back and expecting the teacher to do all the
work. Education becomes a shared responsibility with CMC augmentation (McComb,
1994). Augmenting classes with CMC makes course preparation more efficient, and
thereby frees up energy and time for researching course content and for interacting with
students. With CMC, instructors can put all course materials online for students to access
at their convenience. CMC also allows for online submission and marking of
assignments, again lessening the academic paper chase. CMC provides students and
instructors with online files of coursework. Another efficiency factor of CMC is the
ability to communicate directly with a particular group within a larger class. CMC
reduces the endless paperwork and makes course materials readily available for
consultation. This efficiency leaves instructors more time for substantive interaction
(McComb, 1994). McComb (1994) concluded: “CMC is not a panacea or a cure-all for
traditional linear models of instructional communication. Nor is it a replacement for
face-to-face communication. Adding CMC to course design will not automatically mean
that students will take advantage of it or that they will suddenly show more initiative and
responsibility…CMC augmentation is invaluable for a pedagogy that aims to nurture
involvement and initiative in students” (p. 169).
Clay-Warner and Marsh (2000) examined the use of computer-mediated
communication (CMC) in the college classroom. They found that students were more
favorably disposed to using CMC in ways that were familiar to them and did not add
requirements to the course. Students preferred that CMC be offered as an informational
supplement they could access at their convenience; requiring students to use
CMC was associated with lower overall course ratings. Instructors can use CMC as a way to
increase their contact with students as well as increase students’ contact with course
material and with each other. Doing so can enhance traditionally taught courses by
providing additional ways for intellectual exchange and student involvement. Everett and
Ahern (1994) concluded that the use of CMC as a teaching tool can have a positive effect
on students and interpersonal interaction. Their study also concluded that students should
receive some form of external reward for participating in CMC and that a mechanism for
reporting on students’ participation is useful because it encourages continued participation
and applies peer pressure to those who have not been as diligent.
Computer-mediated communication via electronic mail (e-mail), electronic
bulletin boards, and computer conferences has provided a new communication medium
for students and teachers. In an effort to gather information that would assist in more
clearly defining the future of CMC in higher education, Holden and Mitchell (1993)
conducted an Electronic Communication in Instruction study using the Delphi Technique.
The Delphi study was conducted using a panel of 35 higher education faculty members
from around the world who actively use CMC in teaching. Based on the predictions of
the Delphi study, the conclusions that follow are presented as recommendations for future
activities.
1. College campuses should provide free and convenient network access for all faculty and students.
2. Colleges should provide for faculty members the opportunity to develop the additional teaching skills needed to implement instructional CMC applications.
3. Colleges should provide faculty with more time to develop and use CMC applications, and the increased time may take different forms: either released time to attend workshops that offer specific CMC training or collaborative and cooperative learning techniques, or a reduced teaching load to allow time for CMC development work.
4. Colleges should provide a comprehensive program to combat the resistant attitudes of non-CMC-using faculty and thus increase the relatively slow adoption rate of classroom CMC. The adoption program should make faculty aware of the many advantages of CMC, such as speed, cost-effectiveness, flexibility, and convenience (p. 36).
Holden and Mitchell (1993) concluded that the role of CMC in colleges will
likely continue to alter parts of the instructional process. By allowing students and
faculty to communicate when and where it is convenient, CMC can make the teaching
and learning process more flexible and instruction more effective. CMC research has
documented many positive instructional benefits of using CMC including student
involvement (Althaus, 1997; Berge & Collins, 1995; Chapman, 1998; Clay-Warner &
Marsh, 2000; Everett & Ahern, 1994; Harasim, 1990; Holden & Mitchell, 1993;
McComb, 1994). Student involvement has been extensively studied, with several
theories emerging as important contributors to educational research. The following
section describes three theories of student involvement and the results of studies
grounded in them.
Student Involvement and Academic and Social Integration
Researchers have long been interested in students’ experiences while in college,
particularly the concept of student involvement. The 1984 Study Group
on the Conditions of Excellence in American Higher Education defined the importance of
student involvement as: “The more time and effort students invest in the learning process
and the more intensely they engage in their own education, the greater will be their
growth and achievement, their satisfaction with their educational experiences, and their
persistence in college, and the more likely they are to continue their learning” (p. 17).
This study urged colleges to create institutional conditions that promoted student
involvement and enhanced the learning opportunities for their students. Kuh et al. (1991)
defined an “involving college” as one in which “…students and faculty are actively
engaged in the life of the campus community and with one another in teaching and
learning…where students are expected to take, and do assume, responsibility for their
learning and personal development” (p. 29).
Scholars who study the impact of college on students agree that what happens
outside the classroom also strongly contributes to students’ college experience. Out-of-
classroom experiences include participation in clubs or social organizations, student
government, discussions with faculty outside the classroom, volunteerism, learning about
other cultures, meeting with other students to study, or utilizing campus facilities.
Participation in extracurricular activities and conversations with peers have been
positively related to satisfaction and persistence (Astin, 1977; Lundeberg & Moch, 1995;
Pascarella & Terenzini, 1991). Moreover, out-of-class experiences have been positively
linked to leadership skills, self-confidence, self-awareness, social competence and an
appreciation for diversity (Feldman & Newcomb, 1969; Friedlander & MacDougall,
1992; Kuh, 1993). Student involvement has been extensively studied, with several
theories emerging as important contributors to educational research. These theories will
be presented and discussed in more detail.
Tinto’s Student Integration Model
Tinto (1975) developed a model of persistence/withdrawal behavior that built
upon the earlier works of Durkheim (1951) and Spady (1970, 1971). Durkheim studied
the social factors involved in suicide, and Spady later drew from Durkheim’s theory in
studying student attrition. Spady identified situations and reasons that caused people to
feel so isolated from and disconnected with their environment that removing themselves
from it seemed the most appropriate action. Building upon the research of Durkheim and
Spady, Tinto posited a
model of student and college integration that provided a theoretical foundation for student
involvement and retention (Tinto, 1975; Tinto, 1982; Tinto, 1987; Tinto, Russo, &
Kadel, 1994).
Figure 1. Tinto’s 1975 theoretical schema: The 13 primary propositions. [The schema links student entry characteristics, initial goal and institutional commitment, academic and social integration, subsequent goal and institutional commitment, and persistence.]
Tinto (1975) stated that students enter college with varying background attributes,
varying personal educational expectations, and institutional commitments. While in
college, students interact with both the academic system and the social system of the
institution. Tinto described the college experience as a longitudinal process of
interactions in the academic and social systems of the institution, which influences
students’ goals and commitments towards their continuance at that school and ultimately
towards completion. “It is the interplay between the individual’s commitment to the goal
of college completion and his commitment to the institution that determines whether or
not the individual decides to drop out” (Tinto, 1975, p. 96). Additionally, Tinto (1975)
stated that there should be a “reciprocal functional relationship between the two modes
[academic and social] of integration” (p. 92). Tinto (1975) believed that students could
become too involved in the academic domain of the school, with too little integration into
the social domain, or so socially involved that grades would suffer.
A critical element of the Tinto model is the out-of-classroom interaction with
faculty. Berger and Braxton (1998) stated that “Both Spady (1971) and Tinto (1975)
suggest that interaction with faculty not only increases social integration and therefore
institutional commitment, but also increases the individual’s academic integration” (p.
109). Tinto (1987) summarized his theory of student integration and the retention that
ensues from it: the quality of faculty-student interactions and students’ integration into
the institution’s social and intellectual life are central factors in student attrition, and
institutions must focus upon student involvement and students’ social and intellectual growth.
When this is achieved, “enhanced student retention will naturally follow” (p. 5).
Tinto (1975) also proposed that students come to college with expectations. If
these expectations are unmet, the disenchantment of students could hinder academic and
social integration, which could subsequently impact institutional and goal commitments,
and ultimately, student departure. Stage (1990) believed that “Today few would question
that students’ commitment, academic integration, and social integration are crucial to
their academic success” (p. 250). The Tinto model stated that “Other things being equal,
the higher the degree of integration of the individual into the college system, the greater
will be his commitment to the specific institution and to the goal of college completion”
(p. 96). According to Tinto’s theory, departure results from interactions among
individuals within an institution over a period of time. Tinto hypothesized that departure
is determined by whether an individual’s commitment and intentions match the
institution’s academic and social systems. The theory suggests that a longitudinal
process of interactions between an individual with given attributes, skills, and
dispositions (commitment and intentions) and other members of the institution’s
academic and social systems either further one’s social and academic integration (thus
enhancing the likelihood of not departing from the institution until degree completion) or
leads to insufficient integration (increasing the likelihood of one’s departure).
Tinto’s specific approach has been used by many researchers (Aitken, 1982;
Baumgart & Johnstone, 1977; Bean, 1980, 1982; Braxton & Brier, 1989; Braxton, Brier,
& Hossler, 1988; Cabrera et al., 1992; Munro, 1981; Pascarella & Chapman, 1983a,
1983b; Pascarella, Duby, & Iverson, 1983; Pascarella & Terenzini, 1979, 1980, 1983;
Terenzini & Pascarella, 1977, 1980; Terenzini et al., 1985), most of whom appear to
agree that Tinto’s model provides the best explanation of student departure from
institutions of higher education. Berger and Braxton (1998) asserted that although there
is not a clear definition of social integration, which is one of the core concepts of Tinto’s
model, several studies have conceptualized social integration as peer group relations and
faculty relations (Braxton & Brier, 1989; Pascarella & Terenzini, 1980; Peters, 1992).
Evidence also supports Tinto’s notion that organizational attributes, not size, selectivity,
or control but the ways in which students experience the organizational behavior of an
institution, impact social integration (Braxton & Brier, 1989).
Pascarella (1985) utilized Tinto’s integration model and factored in institutional
characteristics such as the size of the institution, selectivity of the institution, student-
faculty ratio and percentage of residential students. Pascarella (1985) determined that if
these pre-college characteristics are controlled, the frequency and quality of out-of-
classroom faculty interactions positively influences students’ cognitive development.
Research by Elkins, Braxton, and James (1998) showed that social integration “positively
influences subsequent institutional commitment, which, in turn, positively affects the
likelihood of student persistence in college” (p. 18). Moreover, Peters (1992) stated that
the university “should not simply be a bureaucratic apparatus…from which the student
regularly receives written assignments, but should be a living institution of which he
himself is a part” (p. 264). Tinto’s Student Integration Model of college student
persistence/withdrawal has provided a strong theoretical framework for studying
traditional students of four-year, residential, selective universities even though it does not
address external factors impacting college students.
Braxton, Sullivan, and Johnson (1997) conducted a comprehensive review of
empirical studies utilizing Tinto’s model and found that by early 1995, Tinto’s model had
been cited in published research in excess of 400 times and in approximately 170
dissertations. Braxton, Sullivan, and Johnson (1997) reviewed empirical studies utilizing
the Tinto model and Tinto’s 15 testable propositions:
1. Student entry characteristics affect the level of initial commitment to the institution.
2. Student entry characteristics affect the level of initial commitment to the goal of graduation from college.
3. Student entry characteristics directly affect the student’s likelihood of persistence in college.
4. Initial commitment to the goal of graduation from college affects the level of academic integration.
5. Initial commitment to the goal of graduation from college affects the level of social integration.
6. Initial commitment to the institution affects the level of social integration.
7. Initial commitment to the institution affects the level of academic integration.
8. The greater the level of academic integration, the greater the level of subsequent commitment to the goal of graduation from college.
9. The greater the level of social integration, the greater the level of subsequent commitment to the institution.
10. The initial level of institutional commitment affects the subsequent level of institutional commitment.
11. The initial level of commitment to the goal of graduation from college affects the subsequent level of commitment to the goal of college graduation.
12. The greater the level of subsequent commitment to the goal of college graduation, the greater the likelihood of student persistence in college.
13. The greater the level of subsequent commitment to the institution, the greater the likelihood of student persistence in college.
14. A high level of commitment to the goal of graduation from college compensates for a low level of commitment to the institution, and vice versa, in influencing student persistence in college.
15. A high level of academic integration compensates for a low level of social integration, and vice versa, in influencing student persistence in college.
Note: Tinto added the final two propositions, which are not integral to the longitudinal sequence of the others in accounting for student departure (p. 109).
Three of these propositions specifically pertain to academic and social integration.
These propositions include: number 8, “The greater the level of academic integration, the
greater the level of subsequent commitment to the goal of graduation from college;”
number 9, “The greater the level of social integration, the greater the level of subsequent
commitment to the institution;” and number 15, “A high level of academic integration
compensates for a low level of social integration, and vice versa, influencing student
persistence in college.”
Braxton et al. (1997) summarized their findings for Proposition Number 8, “The
greater the level of academic integration, the greater the level of subsequent commitment
to the goal of graduation from college,” and described academic integration and goal
commitment as an “expected relationship” (p. 122). Of eight multi-institutional tests
reviewed, four empirically upheld this proposition (Braxton, Vesper, & Hossler, 1995;
Munro, 1981; Cash & Bissel, 1985; Williamson & Creamer, 1988). Five tests of
this proposition suggested moderate support (Allen, 1986; Cabrera et al., 1992; Cabrera,
Nora, & Castaneda, 1992; Pascarella & Terenzini, 1983; Terenzini, Pascarella,
Theophilides, & Lorang, 1985). While four-year institutions offered strong affirmation
of this proposition, studies at two-year colleges offered moderate support (Williamson &
Creamer, 1988). Academic integration and subsequent goal commitment received
moderate support in studies at commuter institutions, with three of seven assessments
supporting this association (Allen, 1986; Cabrera et al., 1992; Cabrera, Nora, &
Castaneda, 1992). Stage (1988) found no statistically reliable support for the theorized
influence of academic integration on goal commitment for either male or female students,
and Pavel (1991) reported no support for Native American students.
Proposition Number 9, “The greater the level of social integration, the greater the
level of subsequent commitment to the institution,” had mixed results in the Braxton et al.
(1997) summary. Four of seven assessments showed support (Allen, 1986; Allen &
Nelson, 1989; Cabrera et al., 1992; Cabrera, Nora, & Castaneda, 1992). Two studies
utilizing male students upheld this proposition (Pascarella et al., 1986; Stage, 1988), and
one of two assessments made of female students supported this concept (Stage, 1988).
Multi-institutional assessment made at two-year college settings and at commuter
institutions offered moderate support for this proposition (Pascarella, Smart, & Ethington,
1986). Only one assessment utilizing specifically Native American and Alaskan students
was reviewed that offered no support for this proposition (Pavel, 1991).
Braxton et al. (1997) found aggregated support for Proposition Number 15,
“Academic integration and social integration are mutually interdependent and reciprocal
in their influence on student persistence in college.” This proposition was confirmed in
three of four assessments of single institutions (Pascarella & Terenzini, 1979, 1983). One
of two applications of this proposition utilizing male students supported this concept
(Pascarella & Terenzini, 1983), and both assessments utilizing female students offered
vigorous support (Pascarella & Terenzini, 1979, 1983). Cabrera et al. (1992) performed
this assessment at a commuter school and found that academic and social integration are
related to one another in a statistically reliable way.
Tinto (1982) refined and updated his theory several times in subsequent years.
He pointed out that this model “explains only certain modes or facets of attrition
behaviors and addresses characteristic behavior of individuals only as they interact with
institutions” (p. 688). Tinto recognized that this model has limitations in distinguishing
differences pertaining to gender, race, and socioeconomic status and stated “recognizing
theoretical limits should not…constrain us from seeking to improve our existing models
or replace them with better ones” (p. 689).
The extensive review of the Tinto Model by Braxton et al. (1997) concluded that
Tinto’s primary proposition might still be of value to colleges and universities toward
understanding the college involvement/departure process. The reviewers concurred with
Tinto’s suggestion that researchers should seek to revise and improve models utilized for
institutional research. Braxton et al. (1997) stated “through a greater understanding of
the departure puzzle, individual colleges and universities can better manage their
enrollments. Moreover, scholars will come to better understand not only this phenomena,
but also will come to have a window on other facets of the college student experience”
(p. 159).
Tinto (1975) proposed a prospective model of student persistence which considers
a comprehensive set of background and psychosocial factors. Central to the model is the
impact academic and social integration has on goal and institutional commitment and on
the subsequent decision to persist or withdraw from the institution. A number of
validation studies, which generally support the model, have been conducted within four-
year college and university settings. The few efforts to validate the model within
community colleges yield evidence supporting the importance of academic integration,
though evidence of the connection between social integration and persistence has been
mixed.
Because many of the studies using Tinto’s (Tinto et al., 1994) model have focused
on 4-year institutions, it is important to know the difference, if any, between those
findings and the ones related to 2-year institutions. Public 2-year colleges have a
different student population than 4-year colleges and universities. Public 2-year college
students are nontraditional and nearly all commute. Ross (1992) conducted a study to
provide insight into 2-year student departure. Descriptive data indicated that
developmental studies students had a “lower level of satisfaction with their academic
experiences” (p. 70). Ross suggests that these students “focus on academics for the first
year and allow themselves the rewards of social involvements as they enter college-level
work” (p. 72).
Napoli and Wortman (1996) attempted to validate Tinto’s (Tinto et al., 1994) model
in a community college setting. They conducted a meta-analysis of community college
academic and social integration literature to assess the impact and relative importance of
academic and social integration on the persistence/withdrawal behavior of community
college students. Their results indicated that academic integration has significant and
beneficial effects on both term-to-term and year-to-year measures of persistence. Social
integration was also observed to be significantly and positively linked to term-to-term
persistence, but less strongly related to year-to-year persistence (Napoli & Wortman,
1996). The results indicated that both were important factors for students deciding to
remain in or withdraw from school (Napoli & Wortman, 1996).
Because most of the research conducted on Tinto’s (Tinto et al., 1994) model had
been done at 4-year institutions, Burnett (1996) decided that she would investigate social
integration at 2-year institutions. She found that “student retention appears to be
unrelated to participation or lack of participation in co-curricular activities” (p. 47).
Students, on the other hand, seem to think that participation in cocurricular activities will
help them achieve academically and “give them greater confidence in their ability to
transfer to another institution” (p. 48). The important element discovered was that
students who participated in cocurricular activities appeared to be “more closely connected
to, and identified with, the college” (p. 47).
Borglum and Kubala (2000) also investigated how Tinto’s model of retention
(Tinto, Russo, & Kadel, 1994) could be applied to 2-year institutions. Their research
explored academic and social integration and their effects on student withdrawal rates as
well as the effect of background skills on withdrawal rates. Study participants were 462
degree-seeking second-semester community college students who completed a survey
regarding their satisfaction with the academic and social climate of the community
college. Performances on Computer Placement Tests were correlated with withdrawal
rates to determine the association between background skill levels and withdrawal
patterns. No correlation was found between academic and social integration and
withdrawal rates. However, findings did show that the poorer the Computer Placement
Tests performance, the more likely students were to withdraw from courses. Myers
(2001) conducted a research study to determine the influences on persistence of
community college technical degree seekers based on Tinto’s model of institutional
departure. The research results showed that age, social interactions and gains in career
development had positive effects on the persistence of technical degree seekers and
provided insight into the importance of the social integration opportunities and their
related impact on community college students.
Keeping students enrolled is one of the primary challenges facing colleges in this
time of financial constraints. Community colleges, where only a third of all beginning
full-time students earn associate degrees or certificates, are especially cognizant of this
challenge. Community college administrators are aware of the many hurdles they face in
trying to retain more students. Most community college students commute, are older and
generally poorer than four-year college students, and have multiple obligations outside of
school including careers, families, and volunteer work. All of these factors greatly limit
the time and energy they can devote to college work. Some community colleges are
succeeding in increasing both student learning and retention by structuring their
educational programs in new ways that stress the importance of academic and social
community in students’ lives. Showing particular promise in this direction is the
implementation of learning communities or collaborative learning programs that enable
faculty and students to work together as active participants in the learning process
(MacGregor, 1991; Matthews, 1994; Tinto, Russo, & Kadel, 1994). In an attempt to
better understand how community colleges can make use of such programs, Tinto, Russo,
& Kadel (1994) took an in-depth look at the Coordinated Studies Program (CSP) at
Seattle Central Community College, Washington, and concluded that community colleges
can successfully involve students in education, thus enhancing their learning and
increasing their persistence (Russo, 1993; Tinto & Russo, 1993; Tinto, Russo, & Kadel,
1994).
Astin’s Student Involvement Model
Astin proposed his student involvement model in 1977. Astin (1984) described
student involvement as “the amount of physical and psychological energy that the student
devotes to the collegiate experience. Thus, a highly involved student is one who, for
example, devotes considerable energy to studying, spends much time on campus,
participates actively in student organizations, and interacts frequently with faculty
members and other students” (p. 297). Astin (1984) defined the term “involvement” as
an active term and outlined the following list of descriptive verbs that he felt were
appropriate to his model: “engage in, incline toward, join in, partake of, participate in,
show enthusiasm for, take part in, undertake” (p. 298). He also drew from the Freudian
concept of “cathexis” (p. 198), which stated that people invest psychological energy in
persons and objects outside themselves, such as their friends, families, schools, and jobs.
Astin (1984) believed that “effort” was a narrower definition of involvement, yet very
closely related to the concept. He believed that his student involvement research would
be useful to researchers, college administrators and faculty for creating effective learning
environments.
Astin (1984) described the relationship between student involvement and learning
in five postulates:
1. Involvement is the investment of psychological and physical energy in an activity.
2. Students invest varying amounts of energy in activities.
3. Involvement has quantitative and qualitative features. A student could belong to a number of clubs, or utilize the library on a regular basis.
4. The benefits derived from involvement are a function of the quality and quantity of effort expended.
5. The effectiveness of any educational policy or practice is related to the extent to which it encourages students to take initiative and become actively engaged in appropriate activities (p. 298).
Astin (1984) described his model of involvement as requiring active student
participation and posited that the most precious institutional resource may be the
student’s time. Moreover, Astin (1984) suggested that the achievement of developmental
goals is a “direct function of the time and effort they (students) devote to activities
designed to produce these gains” (p. 301). Based upon their exhaustive review of research
literature, Pascarella and Terenzini (1991) offered the following statement concerning
Astin’s (1977) propositions:
“Astin offers a general dynamic, a principle, rather than any detailed, systematic description of the behaviors or phenomena being predicted, the variables presumed to influence involvement, the mechanisms by which those variables relate to and influence one another, or the precise nature of the process by which growth or change occurs” (p. 51).
The question of whether or not Astin’s propositions frame a theory will be left to
future researchers. However, Astin’s work did provide the stimulus for numerous studies
evaluating the relationship between student involvement and learning. He posited that
students should actively engage in the opportunities presented in the college environment,
including meeting and talking with other students, participating in student organizations,
and interacting with faculty outside the classroom. Astin (1985) stated that “frequent
interaction with faculty members is more strongly related to satisfaction with college than
any other type of involvement or indeed any other student or institutional characteristic”
(p. 149). Interaction with faculty and advisors, especially in the form of career advice,
has been shown to contribute to persistence and is also perceived as creating an
environment of support and encouragement (Kim & Alvarez, 1995; Rayman & Brett,
1995). Several studies utilizing Astin’s (1977) model found that students’ participation in
extracurricular activities enhanced their chance of persistence and provided higher
college experience satisfaction (Fitch, 1991; Miller & Jones, 1981). Extracurricular
student involvement also contributed to valued college outcomes (Bowen, 1977;
Chickering & Reisser, 1993; Kuh, 1981, 1993; Pascarella & Terenzini, 1991), greater
maturity gains and enhanced career decision-making skills (Winter, McClelland, &
Stewart, 1981), as well as general skill gain (Cousineau & Landon, 1989).
The student population initially described by Astin (1977) was predominantly
traditional-aged, residential college students. Astin (1984) conceded that educators are
competing with “other forces” in the finite time of students such as jobs, families, friends,
and outside activities. Astin (1984) also asserted that these other interests “represent a
reduction in the time and energy the student has to devote to education development” (p.
301). These outside factors create an even more formidable challenge for institutions with
predominantly commuter, non-traditional aged students. Astin acknowledged that
commuter students were much more likely to withdraw from college than were
residential students. Utilizing Astin’s model, the Maryland Longitudinal Study (1987)
found that there was not a significant difference between persisters and non-persisters in
the area of academic involvement; however, the non-persisters were “very much”
involved in their work experience, interactions with fellow employees, and the work
itself. As stated previously, much of the research utilizing Astin’s model has been based
upon student population homogeneity. According to Pascarella and Terenzini (1998), the
demographic projections for the future suggest that student population heterogeneity will
be the trend.
C. Robert Pace and Quality of Effort
Pace (1979, 1984, 1998) was convinced that the breadth and scope of student
involvement was essential to the quality of undergraduate education. According to Pace
(1998), “Prior research had not included what turns out to be the most influential
variable—the quality of effort that students themselves invest in using the facilities and
opportunities for learning and development that exist in the college setting” (pp. 18-19).
He defined quality of effort as “voluntary behavior, initiative, (or) personal investment
that students are making for their own higher education” (p. 31). Pace believed that
quality of effort could be measured and this knowledge would contribute to the
understanding of student development and learning.
Pace (1998) theorized that education is “both a process and a product” (p. 28) and
that it was important to measure the quality of the process. He stated that while most
educational programs judge themselves on the basis of “product,” or knowledge acquired
and demonstrated skills, that “process” should also be examined and considered.
According to Pace (1982), “The quality dimension is the level of cognitive effort, with
the higher levels contributing more solidly to the acquisition of knowledge and
understanding” (p. 4). Pace developed a theoretical model (Figure 2) to illustrate his
theory of growth and development.
Figure 2. Pace’s (1979) student development and college impress model (p. 126). [The model links entrance criterion measures, college experiences and events, effort and environment, and exit outcomes of student development and college impress.]
In this model, Pace (1979) presented three basic propositions:
1. The college experience consists of the events one encounters in college.
2. The nature, or meaning, of these events, experiences, and encounters is influenced by certain features of the environment and by the amount, breadth, and quality of the effort students exert.
3. The combined influences of environment and effort lead to student development and college impress.
Unlike the developers of the previous models discussed, Pace operationalized his
model by developing an instrument called the College Student Experiences Questionnaire
(CSEQ) to measure college students’ involvement. This instrument measured the quality
of effort put forth by students outside of the classroom, including interaction with faculty
and other students, use of the facilities, and frequency of participation in cultural and
recreational campus opportunities. Students taking the instrument were also asked to
estimate how much progress, or gain, they had made towards a list of educational goals.
Students were presented with a list of experiences in various categories and asked to
respond to each activity by indicating both frequency and level of participation. To
obtain high “scores,” students must participate in an activity in a highly involved
manner, not simply attend numerous activities with a low degree of participation.
The score obtained would then reflect the quality of effort, and not just frequency.
Pace (1979, 1984, 1998) posited that activities requiring greater effort on the part
of students should be more meaningful and educative. Research utilizing Pace’s model
confirmed that the amount and level of effort put forth by students was a very strong
indicator of the quality of their educational experience. The concept of student
involvement as it relates to persistence, retention, and overall satisfaction was widely
accepted by researchers studying college students at four-year institutions
(Abrahamowicz, 1988; Braxton et al., 1995; Cabrera et al., 1992; Davis & Murrell, 1990;
Dowaliby, Garrison, & Dagel, 1993; Kuh, 1995; Munro, 1981; Pace, 1981, 1982, 1996;
Pascarella & Terenzini, 1979; Wolfe, 1993).
Community College Student Experiences Questionnaire (CCSEQ)
Pace (1998) recognized that higher education enrollment has shifted from
predominantly residential, traditional-aged students to older, part-time, non-white
students. He stated that higher education must consider the needs of this diverse student
population, and the changing conditions on campuses. The College Student Experiences
Questionnaire (CSEQ) developed by Pace (1979) that operationalized the constructs of
his theoretical perspective was modified by Friedlander, Pace, and Lehman (1990) to
focus on student involvement at two-year colleges. While many of the items are similar
to the CSEQ, the Community College Student Experiences Questionnaire (CCSEQ)
recognized that students at two-year schools differ from the traditional-aged residential
full-time students at many four-year schools. In addition to being non-residential and
typically older, students at two-year schools frequently have more work and family
responsibilities that limit campus involvement outside the classroom.
Friedlander, Murrell, and MacDougall (1993) examined how the CCSEQ could
aid community college administrators in understanding and promoting student
involvement and achievement. According to Friedlander et al. (1993), the CCSEQ
evaluates in-class and out-of-class activities as well as measures the level of progress
students feel they have made in achieving educational outcomes. Moreover, the
instrument records the amount and breadth of students’ experiences as they engage in the
resources and opportunities at the college setting, e.g., the library, interactions with
faculty, staff and administrators, social opportunities, cultural opportunities, and the
diversity of peer interactions. The CCSEQ endorses the concept that the greater the
involvement of students, the greater progress students report making while enrolled at
that institution. According to Murrell and Glover (1996), “Knowledge about what
learners do and how they respond to an institution’s efforts to provide a rich educational
environment can add an important dimension in determining the impact of the
educational experience” (p. 199).
Student effort, the campus environment, student age, and full- or part-time
enrollment status have significant effects on outcomes for university students. Glover
and Murrell (1998) employed this same set of independent variables to predict how a
sample of community college students perceived their gains in general education and
personal and social development. Using CCSEQ data for 4,210 students, they found that
the quality and quantity of student effort, as well as a positive perception of the campus
environment, were significant predictors of community college students’ perceived gains
in general education and personal and social development. The interaction of the campus
environment with full- or part-time enrollment status was also a significant predictor of
gains in general education. What students do in terms of their involvement in community
college does appear to make a difference in their perceptions of how much they learn
(Glover & Murrell, 1998).
Researchers have utilized the CCSEQ in the past decade to evaluate the effects of
student involvement and quality of effort (Douzenis, 1994, 1996; Douzenis & Murrell,
1992; Friedlander & MacDougall, 1992; Glover, 1996; Knight, 1992, 1994; Polizzi &
Ethington, 1998; Preston, 1993, 1998; Stewart, 1995; Swigart & Ethington, 1998;
Sworder, 1992). Studies using the CCSEQ have also confirmed that quality and quantity
of student effort contributed significantly to gains in personal and social development,
goal commitment, and perceived knowledge gain (e.g., Ackermann, 1990; Douzenis,
1996; Friedlander & MacDougall, 1992; Knight, 1994; Polizzi & Ethington, 1998;
Preston, 1993). Consistent with previous research on non-traditional and commuter
students, measures of academic activities proved to be important predictors of the amount
of progress reported by students towards educational objectives (Douzenis, 1996; Smith,
1993; Stewart, 1995). Studies utilizing the CCSEQ confirmed that community college
students spend limited time on campus outside of their classroom experiences (Douzenis,
1994; Knight, 1992; Lehman, Ethington, & Polizzi, 1995).
Polizzi and Ethington (1998) analyzed CCSEQ student responses from a national
dataset collected from 1990-1994. Their study examined differences in the quality of
student participation in college experiences and the perceptions of gains towards career
preparation. Polizzi and Ethington (1998) analyzed the relationships of these two
variables on four vocational programs: technical/communication, trade/industry,
business, and health. This study confirmed that “the quality of effort devoted to
vocational skills was the only variable significant for all four groups, and that the effort
devoted to vocational skills affected the greatest career preparation gains in all four
groups” (p. 46). This study accentuated the importance of providing students with
opportunities requiring active participation. Studies such as this reinforce the concept of
student involvement and interaction.
Faith and Murrell (1992) found strong similarities between black and white
students in the areas of social and academic quality of effort in a study utilizing student
responses at four two-year colleges in Tennessee. Swigart and Ethington (1998)
examined CCSEQ responses from a national database of 15,263 community college
students to determine if students from various ethnic groups experienced different
patterns of growth and development during their community college tenure. Their results
showed that although there were some “fine differences” between some of the ethnic
groups, overall very little of the variability in estimates of gains could be attributed
to ethnic differences. Mijangos (2001) studied the nature and dynamics of
Hispanic/Latino(a) students’ collegiate experience while enrolled in Iowa
community colleges. The findings indicated that enrollment of Iowa Hispanic/Latino(a)
students was increasing and that these students were having positive learning and
developmental experiences, as indicated by their estimate of gains scores and quality of
effort scale scores on the CCSEQ.
The CCSEQ has also been used by community colleges to gather information on
student demographics, quality of effort, perceived gain, satisfaction, and courses taken
(Ackermann, 1990; Moss & Young, 1995; Summary, 2000; Sworder, 1992). Tinto’s
theory of departure emphasizes the importance of the match between student intentions
and commitments and the institution’s academic and social systems. Thus, it is important
that the stewards of those systems (administrators, counselors, and faculty) understand
the perceptual world of students and relate their interactions, programs, and policies to
that perceptual world. The CCSEQ survey results provide constructs that community
colleges can use as outcome indicators for evaluating progression toward goal-
attainment. Institutional processes, programs, and policies can then be examined for their
effectiveness and suggestions made for improvement.
Pace (1998) acknowledged the addition of technology to the realm of higher
education, e.g., computers, the World Wide Web, the Internet, and other delivery
methods. He believed that higher education must adapt in order to reach those students
utilizing technology as an integral part of their collegiate experience and begin
developing “better questions” for future surveys (p. 32). In response to this
overwhelming shift to
technology in higher education, particularly at two-year colleges, a set of eight
technology questions was added to the CCSEQ Activities Section in 1999.
Summary
One of the pervasive themes in literature is the challenge for community colleges
to embrace the future and adapt curriculum and delivery methods so as to best meet the
needs of their students and communities. Community colleges have used computer
technology, specifically computer-mediated communication, to help them expand access
to higher education, respond to diverse student learning styles, provide vehicles for active
student involvement, and reduce cost. The research on computer-mediated
communication has found that student involvement is one of the many positive
instructional benefits of using CMC. However, this extensive literature review revealed
no research studies that relate CMC to the academic and social integration of college
students. Because computer technology, specifically computer-mediated communication,
has proliferated within teaching and learning in higher education and because of the
importance of academic and social integration, this study was significant in documenting
through quantitative data analysis the impact that computer-mediated communication had
on the academic and social integration of community college students.
CHAPTER 3
METHODOLOGY
This chapter discusses the components of the research methodology. The
research design and sample are discussed. Also, the data collection procedures and
instrument, Community College Student Experiences Questionnaire (CCSEQ), are
presented and variables pertinent to this study are discussed. Also, procedures for
analysis of data, testing of research questions, and expected results are described.
Research Design and Sample
This research focused on the following research question: Does computer-
mediated communication have an impact on the academic and social integration of
community college students? The research was conducted with a pretest-posttest
control-group experimental design using the guidelines and steps outlined by Gall et al.
(1996). Step 1 was to randomly assign research participants to the experimental and
control groups. Step 2 was to administer the CCSEQ as the pretest to both groups. Step 3
was to administer the treatment (computer-mediated communication) to the experimental
group but not to the control group. Step 4 was to administer the CCSEQ as the posttest to
both groups (Gall et al., 1996).
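As a minimal illustration of Step 1, random assignment of a roster to two groups can be sketched as follows. This is not the study's actual procedure or data; the roster names, group sizes, seed, and function name are hypothetical.

```python
import random

def randomly_assign(students, seed=None):
    """Randomly split a roster into control and experimental groups (Step 1).

    A fixed seed is optional; it simply makes the illustration reproducible.
    """
    rng = random.Random(seed)
    shuffled = students[:]          # copy so the original roster is untouched
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    # First half -> control group, remainder -> experimental group
    return shuffled[:half], shuffled[half:]

# Hypothetical roster of 8 students (illustrative only)
roster = [f"student_{i}" for i in range(1, 9)]
control, experimental = randomly_assign(roster, seed=42)

# Every student lands in exactly one group
assert set(control) | set(experimental) == set(roster)
assert not (set(control) & set(experimental))
```

Because assignment precedes any treatment, chance alone determines group membership, which is what licenses attributing later group differences to the treatment.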
In this control-group design the goal was to keep the experiences of the
experimental and control groups as identical as possible, except that the experimental
group was exposed to the experimental treatment. The experimental treatment for this
study was computer-mediated communication. If extraneous variables brought about
changes between the pretest and posttest, these would be reflected in the scores of the
control group. Thus, the posttest change of the experimental group beyond the change
that occurred in the control group was safely attributed to the experimental treatment
(Gall et al., 1996).
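The logic of attributing only the excess change to the treatment can be expressed as a simple gain-score comparison. The scores below are illustrative numbers, not the study's results, and the function name is mine.

```python
def mean(scores):
    """Arithmetic mean of a list of scores."""
    return sum(scores) / len(scores)

def treatment_effect(pre_ctrl, post_ctrl, pre_exp, post_exp):
    """Change in the experimental group beyond the change in the control group."""
    control_gain = mean(post_ctrl) - mean(pre_ctrl)        # extraneous variables only
    experimental_gain = mean(post_exp) - mean(pre_exp)     # extraneous variables + treatment
    return experimental_gain - control_gain

# Illustrative CCSEQ-style scale scores (not actual study data)
effect = treatment_effect(
    pre_ctrl=[20, 22, 21], post_ctrl=[23, 25, 24],   # control gains 3 points on average
    pre_exp=[21, 20, 22], post_exp=[27, 26, 28],     # experimental gains 6 points on average
)
print(effect)  # → 3.0
```

The control group's gain estimates the pretest-to-posttest drift from extraneous variables, so only the experimental group's gain beyond that is credited to the treatment.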
The pretest-posttest control-group experimental design effectively controlled for
the eight threats to internal validity originally identified by Campbell and Stanley (1981):
history, maturation, testing, instrumentation, statistical regression, differential selection,
experimental mortality, and selection-maturation interaction. Nevertheless, the external
validity of this design might have been affected by an interaction between the pretest and
the experimental treatment. That is, the experimental treatment might have produced
significant effects only because a pretest was administered. To help control for this
interaction of testing with the experimental treatment, the research participants taking
the pretest and posttest were not informed of the research purpose until after the
posttest was taken (Gall et al., 1996).
Community college students enrolled in the researcher’s Chemistry 1406 course
at a North Central Texas community college were used to create the sample for this
study. The students were randomly assigned to the control and experimental groups
before the official day of record, completed the course, and took the CCSEQ as a pretest
and posttest during the 1st and 15th weeks of the semester as a part of their research
participation grade (Gall et al., 1996). If students chose not to participate in the survey,
their research participation grade was not affected. Also, the pretests and posttests were
coded so that the research participants remained anonymous. Therefore, the pretests and
posttests were taken voluntarily and anonymously.
Based on these criteria the total sample consisted of 88 community college
students enrolled in the researcher’s Chemistry 1406 lecture and laboratory. The control
group totaled 47 students, and the experimental group totaled 41 students. To control
for potential sources of external invalidity, the researcher was the instructor of record for
the control and experimental research participants in lecture and laboratory. The study by
Everett and Ahern (1994) concluded that students should receive some sort of outside
reward for participating in “extra-classroom” activities to increase participation rate.
Therefore, to encourage active participation in both groups, 10 percent of the overall
course grade was designated as a research participation project. The control group was
given a written research assignment due by week 15 (one week before the end of the
semester). The experimental group was given the task of actively participating in all
aspects of the computer-mediated communication treatment including posting to the
weekly asynchronous bulletin board discussions, listservs, and emails as well as
contributing to the weekly synchronous chat.
The control group and experimental group research participants were enrolled in a
traditional Chemistry 1406 course. To keep the course content consistent for all students,
the control group was given a packet that contained the same course information
documents, reviews, etc. that the experimental group had access to through the online
computer-mediated communication. Only the experimental group research participants
were given the login and password information necessary to access the Blackboard
Learning System used for the computer-mediated communication treatment.
The experimental treatment, computer-mediated communication, was facilitated
by the Blackboard Learning System (www.blackboard.com). The Blackboard Learning
System is a Web-based server software platform that offers course management,
customization and interoperability architecture, and advanced integration and system
management. The Blackboard teaching and learning environment is widely regarded as
the industry’s leading course management system on the basis of ease of use, widespread
adoption, pedagogical flexibility, and breadth of intuitive features and functions.
Blackboard’s online teaching environment includes four primary areas of functionality:
Content Management, Communication, Assessments, and Control.
The Blackboard Learning System includes the following headings:
Announcements, Course Information, Staff Information, Course Documents,
Assignments, Books, Communication, Chat, Discussion Board, Groups, External Links,
and Tools. The Content Management functionality allowed for the following: creating
folder structures to organize course content; posting announcements, course materials,
assignments, links, and faculty and student profiles; and incorporating existing
instructional content by uploading existing files. The Announcements section is
specifically designed for only the instructor to be able to communicate with all students
in the experimental group. This section began with a letter of welcome and some advice
for learning and studying chemistry. During the semester other announcements included
important reminders about due dates and test dates as well as words of encouragement.
The Course Information section included the course syllabus, course information
documents, and laboratory information. The Staff Information section included the
instructor’s picture, resume, and list of publications. The Course Documents section
provided links to all of the class presentations (PowerPoint lectures) divided by chapters
for students to preview and review outside of class. The Assignments section contained
major exam reviews 1-4, lecture final exam review, laboratory midterm review, and
laboratory final review.
The Communication functionality allows for instructor and student collaboration
with asynchronous discussion boards and synchronous chat tools. Communication was
enhanced by enabling collaborative learning through discussion boards, email, real-time
virtual classroom interaction (chat), and group communication (listservs). A number of
weekly forums within the discussion board were created throughout the 16-week
semester, each with a distinct set of properties to allow for various pedagogical
approaches to managing interaction. Also, weekly online text chats were conducted with
a shared whiteboard enhancement feature especially helpful with chemistry discussions.
The Assessments functionality allows for creating and administering quizzes and
surveys. This particular function was not used so that all assessments for the control and
experimental groups were conducted in the same manner. All assessments were given
during the assigned lecture period.
The Control functionality allows the instructor to monitor, control, and
customize the entire course Web site from a Web browser. The computer-mediated
communication was managed through Blackboard’s robust and easy-to-use course
control panel. After the experimental participants were randomly selected, they were
enrolled into the Blackboard computer-mediated communication system (Yaskin &
Gilfus, 2002). See Appendix B for a presentation of the computer-mediated
communication facilitated through the Blackboard Learning System.
Data Collection Procedures and Instrument (CCSEQ)
The data utilized in this study of computer-mediated communication and its
impact on academic and social integration of community college students were collected
through the usage of the Community College Student Experiences Questionnaire
(Friedlander, Pace, & Lehman, 1990). The College Student Experiences Questionnaire
(CSEQ) developed by C. Robert Pace in 1979 was modified by Friedlander, Pace, and
Lehman in 1990 to create the Community College Student Experiences Questionnaire
(CCSEQ). The CCSEQ has been examined critically and empirically for its reliability
and validity. It is designed to measure the amount, scope, and quality of effort students
put into their college experience and the amount of progress students estimate they have
made toward a set of important educational goals (Friedlander, Pace, & Lehman, 1991).
The item coherence of each scale of the CCSEQ is very high. The items within a scale
correlate positively among themselves and reveal a range in the frequency and level of
involvement. A complete copy of the instrument is found in Appendix A.
The CCSEQ addresses issues germane to students enrolled in two-year schools
including varying ages, ethnicities, educational goals, educational backgrounds, family
responsibilities, and socioeconomics. Pace described the CCSEQ and its relevance to
community colleges in this way:
The CCSEQ is an instrument designed to fit the diversity of student characteristics, aims, experiences, and outcomes one finds in community colleges today. It can be given to part-time as well as full-time students; to older as well as younger students; to students who aim to transfer to another college, to seek a degree, or who are pursuing training for a specific job or occupation; or to students who attend for personal or cultural interests, or to work on basic skills. The basic idea behind the questionnaire is the concept of “quality of effort.” All learning requires time and effort by the learner. What students learn in college will depend to a considerable degree on the quality of effort they invest in the
college experience. This [effort] is measured by how much they do with respect to capitalizing on what the college offers—courses, library, writing, arts, science, faculty contacts, student acquaintances, etc. Moreover, some of these student activities require greater effort and have a greater influence on learning and development than other activities—hence, reflect quality of effort (Ethington, Guthrie, & Lehman, 2001, pp. 2-3).
The CCSEQ is designed to gather information from community college students in four
areas: amount, breadth, and quality of effort in both in-class and out-of-class experiences;
progress toward important educational outcomes; satisfaction with the community
college environment; and demographic and background characteristics (Murrell &
Glover, 1996).
The CCSEQ is comprised of the following sections: Background, Work, and
Family; College Program; College Courses; College Activities; Estimate of Gains;
College Environment; and Additional Questions. The first three sections of the CCSEQ
contain items that provide information about student demographics, college programs,
and courses taken at the college. Each section is presented below with a description of
the items in that section.
Background, Work, and Family
Items that address the following characteristics are contained in this section:
• age;
• gender;
• ethnicity;
• native language;
• time spent working on a job;
• the effect of job responsibilities on college work;
• the effect of family responsibilities on college work; and
• currently in a work-study program.
Students are presented with a list of specific responses to each question and instructed to
choose one response.
College Program
This section contains items related to the student’s program at the college. The
topics include:
• number of credits taken during current term;
• total number of credits taken at present college;
• meeting times of classes;
• grades at the college;
• number of hours spent studying;
• number of hours spent on campus (not in class); and
• most important reason for attending college.
College Courses
The first part of the section contains a list of twelve general education areas.
Students are asked to indicate how many courses in each area they have taken. The areas
include college math, computer literacy, English preparation, English composition, fine
arts, foreign languages, humanities, math preparation, physical or health education,
sciences, social sciences, and speech and communications.
In the second part of this section students are asked to respond “yes” or “no” to
the following items:
• working for an AA degree;
• working for an AS degree;
• working for a diploma;
• working for a certificate;
• plan to transfer to a four-year college or university; or
• currently enrolled in an occupational/vocational program.
College Activities
At the heart of the CCSEQ is the College Activities section, designed to measure
student quality of effort in the use of facilities provided by their institution and
experiences with faculty, students, and staff members. This section contains 107 items
that are grouped into 13 topics. Nine groups of items produce not only information about
individual activities, but also form nine Quality of Effort scales which give an indication
of the amount of effort students put into each of those areas of their college experience.
The titles of the groups of activity items are:
1. Course Activities;
2. Library Activities;
3. Faculty;
4. Student Acquaintances;
5. Art, Music, and Theater Activities;
6. Writing Activities;
7. Science Activities;
8. Career/Occupational Skills;
9. Computer Technology;
10. Clubs and Organizations;
11. Athletic Activities;
12. Counseling and Career Planning; and
13. Learning and Study Skills.
Each item represents a specific activity and the students are asked to report how
often they have engaged in the activity during the current school year. For the first 12
groups of activity items, students answer by selecting one of the following categories: (1)
never; (2) occasionally; (3) often; or (4) very often. The Learning and Study Skills group
of College Activities items has a different response format than the other twelve groups
of items. The Learning and Study Skills items are a list of nine skills and students are
asked to indicate whether they have received (1) none, (2) some, or (3) a lot of instruction
in each of these skills. Examples of items from the first 12 item groups are shown in
Table 3.1 (Ethington, Guthrie, & Lehman, 2001). For a complete display of the College
Activity items, see a
copy of the CCSEQ in Appendix A.
Table 3.1 College Activity Topics: Examples of Items in Each Group

Course Activities: Participated in class discussions. Summarized major points and information from reading and notes.

Library Activities: Used the library as a quiet place to read or study material you brought with you. Prepared a bibliography or set of references for a term paper or report.

Faculty: Asked an instructor for information about grades, make-up work, assignments, etc. Discussed your career and/or educational plans, interests, and ambitions with an instructor.

Student Acquaintances: Had serious discussions with students who were much older or much younger than you. Had serious discussions with students whose philosophy of life or personal values were very different from yours.

Art, Music, and Theater Activities: Talked about art (painting, sculpture, architecture, artists, etc.) with other students at the college.

Writing Activities: Used a dictionary [or computer (word processor) spell check/thesaurus] to look up the proper meaning, definition, and/or spelling of words. Asked other people to read something you wrote to see if it was clear to them.

Science Activities: Memorized formulas, definitions, technical terms. Talked about social and ethical issues related to science and technology such as energy, pollution, chemicals, genetics, etc.

Career/Occupational Skills: Read about how to perform a procedure (occupational task, vocational skill). Diagnosed a problem and carried out the appropriate procedure without having to consult any resource.

Computer Technology: Used E-mail to communicate with an instructor or other students about a course. Used a computer to analyze data for a class project.

Counseling and Career Planning: Talked with a counselor/advisor about courses to take, requirements, educational plans. Have taken interest inventories or surveys (e.g., Strong-Campbell Interest Inventory, Kuder Occupational Interest Survey, etc.) to help you direct career goals.

Clubs and Organizations: Looked for notices about campus events and student organizations. Attended a meeting of a student club or organization.

Athletic Activities: Followed a regular schedule or exercise program on campus. Attended an athletic event on the campus.
Estimate of Gains
The Estimate of Gains section of the CCSEQ measures students’ self-reported
progress in six areas: Career Preparation; Arts; Communication Skills; Mathematics,
Science, and Technology; Personal and Social Development; and Perspectives of the
World. Students are asked to report how much they have gained or made progress
toward a series of 25 important educational goals. These goals range from “acquiring
knowledge and skills applicable to a specific job or type of work” to “writing clearly and
effectively” to “becoming clearer about your own values and ethical standards.” Students
indicate their progress toward each goal by selecting (1) very little, (2) some, (3) quite a
bit, or (4) very much.
College Environment
There are eight items in the College Environment section. The first item asks if
the student would choose to attend the same college again. The next five questions ask
students to choose (1) all, (2) most, (3) some, or (4) few or none to indicate the degree to
which they find that:
• students are friendly and supportive of each other;
• instructors are approachable, helpful, and supportive;
• counselors, advisors, and support staff are helpful, considerate, and knowledgeable;
• courses are challenging, stimulating, and worthwhile; and
• the college is a stimulating and exciting place to be.
The last two questions ask if there are sufficient places to meet and study with other
students and if there are places on campus to use computer technology.
Additional Questions
There is a place on the last page of the questionnaire for 20 locally developed
questions. Colleges and researchers may opt to ask students about aspects of the college
experience which are not covered elsewhere in the instrument. Another use of these
Additional Questions is to use one or more of them to identify groups of students for the
purpose of analyzing data in different ways.
Quality of Effort Scales
Quality of effort is defined as “the amount, scope, and quality of effort students
put into taking advantage of the opportunities offered to them by the college” (Pace,
1984). This construct is measured in the CCSEQ by determining how often (during the
current school year) students engage in a variety of activities related to the use of campus
facilities (e.g., classrooms, libraries, science labs, art exhibits) and other opportunities to
increase their academic and social development.
The items that measure Quality of Effort are the College Activities items. The
nine Quality of Effort scales grouped according to topic are: Course Activities; Library
Activities; Faculty; Student Acquaintances; Art, Music, and Theater Activities; Writing
Activities; Science Activities; Career/Occupational Skills; and Computer Technology.
Each scale is formed by adding the separate scores for each item in a group together in
the following manner. If a student answers “never” to an item he/she receives one point
for that item. An answer of “occasionally” gets two points; “often” three points; and
“very often” four points. The points for all items in a group are then added together, and
the result is a scale score for that item group. If any item within a scale is omitted by a
student, then a scale score will not be computed for that student for that particular scale.
Since the Quality of Effort scales contain different numbers of items, their ranges
differ. For example, the Course Activities scale is made up of 10 items, so its range
is 10-40: an individual who answered “never” to all 10 items would receive one point
per item, for a total score of 10, while a person who answered “very often” to all 10
items would receive four points per item, for a total score of 40. The extreme scores
on a ten-item scale are therefore 10 and 40. The higher the score reported on a scale,
the greater the degree of involvement on that scale.
A list of the Quality of Effort scales is presented in Table 3.2.
Table 3.2 Quality of Effort Scales
Scale Number of Items Scale Range
Course Activities 10 10-40
Library Activities 7 7-28
Faculty 9 9-36
Student Acquaintances 6 6-24
Art, Music, and Theater 9 9-36
Writing Activities 8 8-32
Science Activities 11 11-44
Career/Occupational Skills 9 9-36
Computer Technology 8 8-32
One advantage of having scale scores to represent the Quality of Effort students
put into specific areas of their college experience is that the scale scores for groups of
students can be added together and means computed which then represent Quality of
Effort of the group. The college experience can be investigated for groups of students
(by program, by gender, by ethnicity, by enrollment status, etc.) and efforts made to
determine why some groups of students seem to be more involved than others. This
analysis can result in the improvement of delivery of services by the faculty and college
(Ethington, Guthrie, & Lehman, 2001). Murrell & Glover (1996) provided the following
summation:
   Information provided by the CCSEQ gives community college administrators and
   faculty a blueprint for operationalizing theoretical concepts of student
   involvement and engagement. It places responsibility for learning on students
   and holds them accountable for their utilization of the programs and facilities
   provided by the institution. It provides valuable information about the
   interactive processes between students and the institution that is vital if we
   are to put together the puzzle of institutional impact to enhance the social,
   academic, and career development of community college students (p. 200).
Procedures for Data Analysis
The Community College Student Experiences Questionnaire (CCSEQ) was used
as a matched pretest and posttest to collect data on the academic and social integration of
the research participants. The independent variable was group membership (control
versus experimental). The dependent variables were academic integration and social
integration as measured by 17 indicators on the CCSEQ. Nine of the indicators related
to academic integration, and eight dealt with social integration. The nine indicators
of the CCSEQ that were used to determine the patterns of academic integration in this
study were the following:
1) GPA;
2) Time spent studying or preparing for classes;
3) Course activities (10-item scale);
4) Library activities (7-item scale);
5) Learning and study skills (9-item scale);
6) Writing activities (8-item scale);
7) Science activities (11-item scale);
8) Computer technology (8-item scale);
9) Experiences with faculty (9-item scale).
The eight indicators of the CCSEQ that were used to determine the patterns of social
integration were the following:
1) Student acquaintances (6-item scale);
2) Art, music, theater (9-item scale);
3) Hours spent on campus;
4) Clubs and organizations (7-item scale);
5) Athletics (6-item scale);
6) Counseling and career planning activities (8-item scale);
7) College environment (8-item scale); and
8) Estimate of gain (25-item scale).
These 17 indicators that were used are suggested by the Tinto model or
validations of the Tinto model (Pascarella & Terenzini, 1979, 1983; Terenzini &
Pascarella 1977, 1980) and by other researchers (Douzenis, 1994; Moss & Young, 1995).
Since none of these researchers had statistically verified that these indicators were
moderately or strongly correlated with each other, and therefore actually measured
academic and social integration, factor analyses were conducted. After conducting the
factor analyses on the pretest data, a different combination of indicators made up the
academic and social integration dependent variables. The factor analysis results are
discussed in Chapter 4, Results.
Testing the Research Question
The data for the study were collected and analyzed in response to the following
research question outlined in Chapter 1 of this study: Does computer-mediated
communication have an impact on the academic and social integration of community
college students as measured on the CCSEQ? The research was conducted with a
pretest-posttest control-group experimental design with matched pretests and posttests.
The first step in analyzing data from a pretest-posttest control-group experiment was to
compute descriptive statistics on all of the variables measured by the CCSEQ. Mean
scores were computed for the pretest and posttest scores for the experimental and control
groups. The second step was to conduct a factor analysis on the nine indicators of the
CCSEQ that were used to determine the patterns of academic integration, and on the
eight indicators that were used to determine the patterns of social integration, to
verify that these variables were moderately or highly correlated with each other and
therefore measured academic and social integration.
The method of statistical analysis that was used to analyze the data for the control
and experimental groups was analysis of covariance (ANCOVA) to determine whether
the control and experimental groups differed on academic and social integration. The
posttest mean of the experimental group was compared with the posttest mean of the
control group with the pretest scores used as a covariate. The raw data produced from the
responses on the CCSEQ were entered into and analyzed by the Statistical Package for
the Social Sciences (SPSS). The SPSS program was used to compile and analyze the
descriptive and inferential statistics.
Expected Results
The analysis of the data gathered through the administration of the Community
College Student Experiences Questionnaire was expected to address the posed research
question in the following way:
Does computer-mediated communication have an impact on the academic and
social integration of community college students as measured by the CCSEQ?
The study hypothesized that data analysis would show no difference in the academic
and social integration of the control and experimental groups as measured by the
CCSEQ.
CHAPTER 4
RESULTS
This chapter discusses the results of the research. The research sample and
descriptive statistics are discussed. Also, the factor analyses of the academic and social
dependent variables are presented and results discussed. Finally, the analysis of
covariance for academic and social integration is presented with a summary of the
findings.
Research Sample and Descriptive Statistics
The following research question, presented in Chapter 3 of this dissertation, was
examined at the conclusion of the study: Does computer-mediated
communication have an impact on the academic and social integration of community
college students as measured by the CCSEQ? Community college students enrolled in
the researcher’s Chemistry 1406 course at a North Central Texas community college
made up the sample for this study. The research participants were randomly assigned
with a table of random numbers to the control and experimental groups resulting in N=63
in the control group and N=62 in the experimental group. After accounting for attrition
to the research project, N=47 in the control group and N=41 in the experimental group.
The results of the descriptive statistics for the CCSEQ Background, Work,
Family, and College program illustrated that random assignment does not ensure initial
equivalence between groups. Random assignment only ensures absence of systematic
bias in group composition. The age of the research participants for the control and
experimental groups ranged from 18-19 or younger to 40-55. The biggest differences in
the two groups were for the 20-22, 23-27, and 28-39 age categories. The control group
contained 19.1 % age 20-22 and the experimental group 24.4 %. The control group
contained 12.8 % age 23-27 and the experimental group 17.1 %. And the control group
contained 27.7 % age 28-39 and the experimental group 17.1 %.
The gender distribution was fairly consistent between the two groups. The
control group contained 12.8 % male and 87.2 % female. The experimental group
contained 17.1 % male and 82.9 % female. The racial or ethnic identification of the
research participants included Asian or Pacific Islander, Black (African-American),
Hispanic (Latino), and White. The Asian participants made up 6.4 % of the control group
and 9.8 % of the experimental group. The control group contained more Black
participants (29.8 %) than the experimental group (12.2 %). The Hispanic distribution
was consistent at 14.9 % for the control group and 17.1 % for the experimental group.
Finally, the control group contained 48.9 % White and the experimental group 61.0 %
White.
The control and experimental groups were consistent on the English language
(Native or Non-native) responses. The control group contained 72.3 % Native and 27.7
% Non-native, and the experimental group contained 78.0 % Native and 22.0 %
Non-native. The responses to the question about time spent working on a job during
the college session were also similar for the control and experimental groups. The control group
participants reported a total of 29.8 % for “no job or 1-10 hours of work” with the
experimental group totaling 34.2 %. The control group totaled 42.5 % for “11-30 hours
of work” with the experimental group reporting a total of 46.4 %. Finally, the control
group totaled 27.6 % for “31 to more than 40 hours of work” and the experimental group
reported a total of 19.5 %. The responses to the effect of job on school work question
were also similar for the control and experimental groups. The control group reported a
total of 48.9 % for “no job or job does not interfere with school work” with the
experimental group at 53.7 %. The control group reported a total of 51.0 % for the “job
takes some or a lot of time from school work” with the experimental group at 46.4 %.
The distribution of responses to the “effect of family responsibilities on college
work” was also similar for the control and experimental groups. The control group
reported 51.1% for “no family responsibility or family does not interfere” with the
experimental group reporting 56.1 %. The control group reported a total of 49.0% for
“family takes some or a lot of time from school work” with the experimental group at
43.9 %. The majority of research participants for both the control and experimental
groups did not participate in a Work-study program. Only 2.1 % of the control group
participants and 2.4 % of the experimental group participants were part of a Work-study
program.
Many of the research participants were part-time students taking less than 12
hours of coursework during the semester. The control group reported a total of 44.7 %
for less than 12 hours of coursework taken this semester with the experimental group at
53.7 %. The remaining research participants were full-time students taking more than 12
hours of coursework during the semester. The control group reported a total of 55.3 %
for 12 or more hours with the experimental group at 46.3 %.
Since the research participants were enrolled in a daytime, freshman-level
(introductory) chemistry course, most of the students were “day only” with fewer than 46
total course credits at this community college. The control group reported 85.1 % “day
only” and the experimental group 78.0 %. Also, the control group reported a total of 91.5
% with less than 46 total course credits with the experimental group at 78.0 %. The
majority of the students gave themselves a self-reported GPA of “A” or “B.” The control
group reported a total of 78.7 % “A or B” and the experimental group reported 75.7 % “A
or B.”
The majority of research participants reported that they study 1 to 10 hours per
week with the control at 76.6 % and the experimental group at 70.7 %. The remainder of
these groups reported studying 11 to more than 20 hours per week with the control group
at 23.4 % and the experimental group at 29.3 %. Most of these students spent very little
time on campus outside of class per week. The control group reported 91.5 % for “6 or
fewer hours on campus NOT in class per week” with the experimental group reporting
95.1 %. The remainder of the students reported 7 to more than 12 hours on campus with
the control group at 8.5 % and the experimental group at only 4.9 %. The most important
reasons for attending this college were overwhelmingly to prepare to transfer to a four-
year college or university or to gain skills for a new job or occupation. The control group
reported 51.1 % for “prepare to transfer to a four-year college or university” with the
experimental group at 46.3 %. And the control group reported 42.6 % for “gain skills for
new job or occupation” with the experimental group reporting 48.8 %. A small
percentage of responses was given by both groups for “gain skills to retrain, remain
current, or advance in current job or occupation” with 4.3 % (control) and 4.9 %
(experimental). These results are summarized in Table 4.1 CCSEQ Background, Work,
Family, and College Program Items. Appendix D contains the remainder of the CCSEQ
pretest and posttest descriptive statistics for the control and experimental groups.
Table 4.1 CCSEQ Background, Work, Family, and College Program Items

Item                                                        Control (%)  Experimental (%)
Age
  18-19 or younger                                          34.0         34.1
  20-22                                                     19.1         24.4
  23-27                                                     12.8         17.1
  28-39                                                     27.7         17.1
  40-55                                                      6.4          7.3
  Over 55                                                    0.0          0.0
Gender
  Male                                                      12.8         17.1
  Female                                                    87.2         82.9
Racial or Ethnic Identification
  Native American                                            0.0          0.0
  Asian or Pacific Islander                                  6.4          9.8
  Black, African-American                                   29.8         12.2
  Hispanic, Latino                                          14.9         17.1
  White                                                     48.9         61.0
  Other                                                      0.0          0.0
English Language
  Native                                                    72.3         78.0
  Non-native                                                27.7         22.0
Time Spent Working on Job During College Session
  None, no job                                              25.5         17.1
  1-10 hours                                                 4.3         17.1
  11-20 hours                                               10.6         22.0
  21-30 hours                                               31.9         24.4
  31-40 hours                                               19.1         12.2
  More than 40 hours                                         8.5          7.3
Effect of Job on School Work
  No job                                                    25.5         17.1
  Job does not interfere                                    23.4         36.6
  Job takes some time from school work                      48.9         41.5
  Job takes a lot of time from school work                   2.1          4.9
Effect of Family Responsibilities on College Work
  No family responsibility                                  21.3         22.0
  Family does not interfere                                 29.8         34.1
  Family takes some time from school work                   36.2         34.1
  Family takes a lot of time from school work               12.8          9.8
Work-Study Program
  Yes                                                        2.1          2.4
  No                                                        97.9         97.6
Credits Taken THIS Term
  Less than 6                                                6.4          7.3
  6 to 8                                                    17.0         22.0
  9 to 11                                                   21.3         24.4
  12 to 15                                                  48.9         34.1
  More than 15                                               6.4         12.2
Total Number of Course Credits Taken at This College
  1-15 credits                                              29.8         24.4
  16-30 credits                                             36.2         39.0
  31-45 credits                                             25.5         14.6
  46 or more credits                                         8.5         22.0
When Classes Meet
  Day only                                                  85.1         78.0
  Evening only                                               0.0          0.0
  Some day and some evening                                 14.9         22.0
College Grades
  A                                                         14.9         29.3
  A-, B+                                                    46.8         24.4
  B                                                         17.0         22.0
  B-, C+                                                     4.3          9.8
  C, C-                                                      0.0          0.0
  Lower than C-                                              0.0          2.4
  No grades, this is my first term                          17.0         12.2
Time Spent Studying per Week
  1 to 5 hours                                              44.7         26.8
  6 to 10 hours                                             31.9         43.9
  11 to 15 hours                                            14.9         14.6
  16 to 20 hours                                             6.4         14.6
  More than 20 hours                                         2.1          0.0
Time on Campus NOT in Class per Week
  None                                                      36.2         34.1
  1 to 3 hours                                              36.2         41.5
  4 to 6 hours                                              19.1         19.5
  7 to 9 hours                                               0.0          2.4
  10 to 12 hours                                             4.3          2.4
  More than 12 hours                                         4.3          0.0
Most Important Reason for Attending THIS COLLEGE
  Prepare to transfer to a four-year college or university  51.1         46.3
  Gain skills for new job or occupation                     42.6         48.8
  Gain skills to retrain, remain current, or advance
    in current job or occupation                             4.3          4.9
  Satisfy personal interest (cultural, social)               0.0          0.0
  Improve basic skills (English, reading, or math)           2.1          0.0
Factor Analyses of Pretests
The second step of the data analysis was to conduct individual and group factor
analyses on the nine indicators of the CCSEQ suggested to determine the patterns of
academic integration and on the eight indicators suggested to determine the patterns
of social integration. These factor analyses provided an empirical basis for verifying
that the academic and social integration indicators were moderately or highly
correlated with each other and therefore could be used to calculate academic and
social integration scores. Correlation coefficients with values of .5 or greater
showed a “moderate” to “strong” relationship and were therefore used as the basis for
the interpretation of the factor analyses conducted.
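To make concrete how a one-factor solution produces the eigenvalues, percents of variance, and loadings reported below, here is a minimal Python sketch of principal-component extraction from a correlation matrix via power iteration. The 3 x 3 matrix is invented for the example; the dissertation's analyses were run in SPSS on the full scales.

```python
import math

# An illustrative 3 x 3 correlation matrix (made up for this sketch).
R = [[1.0, 0.6, 0.6],
     [0.6, 1.0, 0.6],
     [0.6, 0.6, 1.0]]

def one_factor_loadings(R, iters=200):
    """Leading eigenvalue and one-factor loadings of a correlation matrix.

    Power iteration finds the leading eigenvector; each loading is that
    eigenvector entry scaled by the square root of the eigenvalue.
    """
    v = [1.0] * len(R)
    for _ in range(iters):
        w = [sum(R[i][j] * v[j] for j in range(len(R))) for i in range(len(R))]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    eigenvalue = sum(v[i] * sum(R[i][j] * v[j] for j in range(len(R)))
                     for i in range(len(R)))
    return eigenvalue, [math.sqrt(eigenvalue) * x for x in v]

eig, loadings = one_factor_loadings(R)
pct_variance = 100 * eig / len(R)  # "% of Variance", as reported in Table 4.2
```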
Academic Integration Factor Analyses
For the academic integration scales, the one-factor solution was the most
appropriate and interpretable solution. Table 4.2 Academic Integration Scales provides
the eigenvalues, percent of variance, and factor loadings for the one-factor model. The
eigenvalues, percent of variance, and factor loadings for the following academic
integration scales provided the evidence that the items in the scale are a cohesive group of
items which measure a single construct: Course Activities, Library Activities, Learning
and Study Skills, Writing Activities, Computer Technology, and Faculty Activities. For
the Science Activities scale, item 10 (r=.459) was omitted since it was less than 0.5 and
therefore showed a “low” relationship.
Table 4.2 Academic Integration Scales

              Course   Library  Learning   Writing  Science  Computer  Faculty
              Activ.   Activ.   and Study  Activ.   Activ.   Tech.     Activ.
                                Skills
Eigenvalue    4.426    3.755    6.304      5.687    6.056    3.971     5.103
% of Variance 44.264   53.648   70.044     71.093   55.059   49.633    56.704
Item 1        .628     .697     .894       .813     .739     .627      .696
Item 2        .598     .670     .852       .880     .869     .675      .773
Item 3        .620     .820     .905       .910     .856     .686      .788
Item 4        .713     .723     .701       .900     .847     .708      .745
Item 5        .598     .689     .812       .875     .800     .726      .713
Item 6        .734     .781     .783       .808     .749     .781      .814
Item 7        .624     .735     .921       .845     .806     .825      .753
Item 8        .702     --       .811       .694     .755     .575      .828
Item 9        .708     --       .832       --       .615     --        .648
Item 10       .708     --       --         --       .459     --        --
Item 11       --       --       --         --       .544     --        --
After running a factor analysis on the individual academic integration scales, a
factor analysis was conducted on the nine academic integration indicators. The
eigenvalue of 3.878, percent of variance of 43.093, and factor loadings for the
academic integration indicators provided evidence that these items formed a cohesive
group measuring a single construct called academic integration. Only the indicators
that had a positive and moderate to strong relationship were used to calculate the
total academic integration score for the research participants. Table 4.3 Academic Integration
Component Matrix summarizes the results.
Table 4.3 Academic Integration Component Matrix

Academic Integration Indicators            Component 1
Grades                                     -.549
Hours Studying/Week                         .148
Quality of Effort Course Activities         .729
Quality of Effort Library Activities        .710
Learning and Study Skills                   .395
Quality of Effort Writing Activities        .808
Science Academic Integration                .768
Quality of Effort Computer Technology       .774
Quality of Effort Faculty                   .723
Based on the results of the academic integration indicator factor analysis, the
academic integration dependent variable did not include all nine of the original
indicators. The academic integration score was made up of the following indicators:
Course Activities, Library Activities, Writing Activities, Science Activities (omit item
10), Computer Technology, and Faculty Activities. The following indicators were
omitted from the academic integration dependent variable: GPA, Hours Studying/Week,
and Learning and Study Skills.
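The retention rule applied here amounts to keeping only indicators whose loading on the first component is positive and at least .5. A short Python sketch using the Table 4.3 loadings:

```python
# Loadings on Component 1 from Table 4.3 (academic integration indicators).
loadings = {
    "Grades": -.549,
    "Hours Studying/Week": .148,
    "Course Activities": .729,
    "Library Activities": .710,
    "Learning and Study Skills": .395,
    "Writing Activities": .808,
    "Science Activities": .768,
    "Computer Technology": .774,
    "Faculty": .723,
}

# Keep only indicators with a positive, moderate-to-strong loading (>= .5).
retained = [name for name, r in loadings.items() if r >= 0.5]
omitted = [name for name, r in loadings.items() if r < 0.5]
print(retained)  # the six indicators that form the academic integration score
print(omitted)   # ['Grades', 'Hours Studying/Week', 'Learning and Study Skills']
```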
A final factor analysis was run on the remaining academic integration indicators.
The eigenvalue of 3.478, percent of variance of 57.968, and factor loadings for the
academic integration indicators provided evidence that these items formed a single
construct called academic integration. Table 4.4 Academic Integration Construct
Component Matrix summarizes these results.
Table 4.4 Academic Integration Construct Component Matrix

Academic Integration Indicators               Component 1
Quality of Effort Course Activities           .727
Quality of Effort Library Activities          .729
Quality of Effort Writing Activities          .803
Science Academic Integration (omit item 10)   .779
Quality of Effort Computer Technology         .786
Quality of Effort Faculty                     .740
Social Integration Factor Analyses
For the social integration scales, the one-factor solution was the most appropriate
and interpretable solution. Table 4.5 Social Integration Scales provides the eigenvalues,
percent of variance and factor loadings for the one-factor model. The eigenvalues,
percent of variance, and factor loadings for the following social integration scales
provided the evidence that the items in the scale are a cohesive group of items which
measure a single construct: Student Acquaintances, Clubs and Organizations, and
Counseling and Career Planning. For the Art, Music, and Theater Activities scale, item 2
(r=.442) and item 4 (r=.403) were omitted since they had r values less than 0.5 and
therefore showed a “low” relationship. For the Athletic Activities scale, item 5 (r=.362)
was omitted since it had an r value less than 0.5 and therefore showed a “low”
relationship. For the College Environment scale, item 1 (r=.371), item 5 (r=.489), and
item 8 (r=.421) were omitted since they had r values less than 0.5 and therefore showed a
“low” relationship. For the Estimate of Gains scale, item 5 (r=.496), item 14 (r=.485),
82
and item 18 (r=.468) were omitted since they had r values less than 0.5 and therefore
showed a “low” relationship.
Table 4.5 Social Integration Scales

              Student   Art,     Clubs    Athletic  Counsel.   College  Est. of
              Acquaint. Music,   and      Activ.    and Career Environ. Gains
                        Theater  Organ.             Planning
Eigenvalue    4.133     4.042    5.078    2.443     4.577      2.414    11.674
% of Variance 68.877    44.910   72.541   40.725    57.211     30.176   46.697
Item 1        .726      .521     .791     .741      .768       .371     .503
Item 2        .887      .442     .795     .690      .794       .562     .758
Item 3        .917      .782     .871     .699      .832       .662     .647
Item 4        .784      .403     .831     .566      .749       .633     .609
Item 5        .833      .764     .933     .362      .834       .489     .496
Item 6        .818      .711     .944     .692      .755       .667     .567
Item 7        --        .618     .780     --        .702       .512     .626
Item 8        --        .863     --       --        .587       .421     .631
Item 9        --        .765     --       --        --         --       .741
Item 10       --        --       --       --        --         --       .691
Item 11       --        --       --       --        --         --       .771
Item 12       --        --       --       --        --         --       .769
Item 13       --        --       --       --        --         --       .835
Item 14       --        --       --       --        --         --       .485
Item 15       --        --       --       --        --         --       .725
Item 16       --        --       --       --        --         --       .805
Item 17       --        --       --       --        --         --       .763
Item 18       --        --       --       --        --         --       .468
Item 19       --        --       --       --        --         --       .743
Item 20       --        --       --       --        --         --       .624
Item 21       --        --       --       --        --         --       .625
Item 22       --        --       --       --        --         --       .656
Item 23       --        --       --       --        --         --       .818
Item 24       --        --       --       --        --         --       .703
Item 25       --        --       --       --        --         --       .809
After running a factor analysis on the individual social integration scales, a factor
analysis was conducted on the eight social integration indicators. The eigenvalue of
2.840, percent of variance of 35.504, and factor loadings for the social integration
indicators provided evidence that these items formed a cohesive group measuring a
single construct called social integration. Only the indicators that had a positive and
“moderate” to “strong” relation were used to calculate the total social integration score
for the research participants. Table 4.6 Social Integration Component Matrix summarizes
these results.
Table 4.6 Social Integration Component Matrix

Social Integration Indicators               Component 1
Quality of Effort Student Acquaintances     .491
Social Integration Art Music Theater        .696
Hours on Campus/Week                        .361
Clubs and Organizations                     .647
Social Integration Athletics                .553
Counseling and Career Planning              .786
Social Integration College Environment      .275
Social Integration Estimate of Gains        .752
Based on the results of the social integration factor analysis, the social integration
dependent variable did not include all eight of the original indicators. The social
integration dependent variable was made up of the following indicators that had a
positive r with a .5 or greater value: Art, Music, and Theater Activities (omit items 2 and
4); Clubs and Organizations; Athletic Activities (omit 5); Counseling and Career
Planning; and Estimate of Gains (omit items 5, 14, and 18). The following indicators
were omitted because they had a negative r or a value less than 0.5: Student
Acquaintances, Time Spent on Campus, and College Environment.
A final factor analysis was run on the remaining social integration indicators. The
eigenvalue of 2.512, percent of variance of 50.237, and factor loadings for the social
integration indicators provided evidence that the items formed a cohesive group
measuring a single construct called social integration. Table 4.7 Social Integration
Construct Component Matrix summarizes the results.
Table 4.7 Social Integration Construct Component Matrix

Social Integration Indicators                                  Component 1
Social Integration Art Music Theater (omit items 2 and 4)      .691
Clubs and Organizations                                        .692
Social Integration Athletics (omit item 5)                     .615
Counseling and Career Planning                                 .805
Social Integration Estimate of Gains (omit items 5, 14, 18)    .728
After the academic and social integration dependent variables were determined
through factor analyses, academic and social integration scores were calculated for the
pretests and posttests of the research participants in the control and experimental groups.
These academic and social integration scores were then summed and a mean calculated
for the control and experimental groups on their pretests and posttests. The control group
academic integration mean on the pretest was 105.64 and on the posttest 118.64. The
experimental group academic integration mean on the pretest was 112.39 and on the
posttest 115.93. The control group social integration mean on the pretest was 92.19 and
on the posttest 102.51. The experimental group social integration mean on the pretest
was 94.34 and on the posttest 90.93. Table 4.8 Academic and Social Integration
Descriptive Statistics summarizes these results.
Table 4.8 Academic and Social Integration Descriptive Statistics, Pretest/Posttest,
Control and Experimental

                                        Academic Integration   Social Integration
Pretest Control         Mean            105.64                 92.19
                        N               47                     47
                        Std. Deviation  29.104                 23.856
Posttest Control        Mean            118.64                 102.51
                        N               47                     47
                        Std. Deviation  25.831                 23.261
Pretest Experimental    Mean            112.39                 94.34
                        N               41                     41
                        Std. Deviation  24.803                 23.868
Posttest Experimental   Mean            115.93                 90.93
                        N               41                     41
                        Std. Deviation  23.860                 24.368
Total                   Mean            113.08                 95.15
                        N               176                    176
                        Std. Deviation  26.343                 24.062
Analysis of Covariance (ANCOVA)
Campbell and Stanley (1981) observed that researchers often use the wrong
statistical procedure to analyze data for pretest-posttest control-group experimental
designs. It would have been incorrect to do a t test comparing the pretest and posttest
means of the experimental group and another t test comparing the corresponding means
of the control group. The preferred statistical method was analysis of covariance
(ANCOVA), in which the posttest mean of the experimental group was compared to the
posttest mean of the control group with the pretest scores used as a covariate.
Analysis of covariance (ANCOVA) is used as a procedure for the statistical
control of an extraneous variable. ANCOVA controls for the effects of this extraneous
variable, called a covariate, by partitioning out the variation attributed to this additional
variable. In other words, ANCOVA allows the researcher to “adjust out” the effects of a
confounding variable prior to running an analysis of variance comparing two or more
groups. The covariate used in an ANCOVA analysis is a continuous (interval-scaled)
variable that the researcher believes is appreciably correlated with the dependent
variable and that may distort the nature of the relationship between the independent
variable (control versus experimental group) and the dependent variable (academic or
social integration). In essence, the covariate adjustment is accomplished by
correlating the covariate with the dependent variable. The proportion of the dependent
variable variance that can be explained by the covariate is removed from the analysis,
and the analysis of variance (ANOVA) is conducted on the remaining (residual) variance.
Using ANCOVA, the researcher can increase the precision of the research by
partitioning out the variation attributed to the covariate, which results in a smaller
error variance and a better investigation of the effects of the primary independent
variable (Hinkle, Wiersma, & Jurs, 1998).
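The "adjust out" logic just described can be sketched in code. The Python sketch below is a simplified residualization illustration only: a full ANCOVA fits one model (posttest ~ group + pretest), and the scores used here are invented for the example.

```python
# Simplified sketch of covariate adjustment: regress the posttest on
# the pretest (the covariate), then compare groups on the residual,
# pretest-adjusted scores. A full ANCOVA fits one combined linear
# model; this only illustrates the partitioning-out idea.
def mean(xs):
    return sum(xs) / len(xs)

def residuals(y, x):
    """Residuals of a simple linear regression of y on the covariate x."""
    mx, my = mean(x), mean(y)
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return [b - (my + slope * (a - mx)) for a, b in zip(x, y)]

pretest  = [100, 110, 95, 120, 105, 115, 98, 108]   # hypothetical scores
posttest = [112, 118, 104, 128, 114, 122, 106, 118]
group    = ["control"] * 4 + ["experimental"] * 4

adj = residuals(posttest, pretest)
control_adj      = [r for r, g in zip(adj, group) if g == "control"]
experimental_adj = [r for r, g in zip(adj, group) if g == "experimental"]

# The ANCOVA F test asks whether the adjusted group difference exceeds
# chance; here we only compute the adjusted group means.
print(mean(control_adj), mean(experimental_adj))
```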
There are three important assumptions underlying the ANCOVA statistical
analysis. The first assumption is the absence of any effect of the independent variable on
the covariate which can be addressed by assuring that the covariate is collected prior to
initiation of the study. The covariate, pretest, for this study was collected before the
study began so this assumption is satisfied. The second assumption is the “homogeneity
of regression.” This assumption is based on the sameness of the relationship between the
dependent variable and the covariate across levels of the independent variable. One
commonly used check is the Levene test, which tests for the equality of the error
variances across groups. The Levene test evaluates the null hypothesis that the
dependent variable variance remaining after the dependent variable is correlated with
the covariate is equivalent across groups, and it should be run routinely prior to
conducting an ANCOVA. This statistical test is one case where it is appropriate not to
reject the null hypothesis. The third assumption for the ANCOVA procedure is the
presence of a linear relationship between the covariate and the dependent variable,
which can be addressed by interpreting the results of the ANOVA on the covariate
(Hinkle, Wiersma, & Jurs, 1998).
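A minimal sketch may make the Levene test concrete: it is a one-way ANOVA carried out on the absolute deviations of each score from its group mean. The two groups of scores below are invented for the example.

```python
# Pure-Python sketch of the Levene statistic: an ANOVA on the absolute
# deviations of each observation from its group mean.
def levene_w(*groups):
    # Absolute deviations from each group's mean.
    z = [[abs(x - sum(g) / len(g)) for x in g] for g in groups]
    k = len(z)                          # number of groups
    n = sum(len(g) for g in z)          # total sample size
    grand = sum(sum(g) for g in z) / n
    between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in z)
    within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in z)
    return ((n - k) / (k - 1)) * (between / within)

control      = [118, 125, 110, 130, 115]  # hypothetical scores
experimental = [116, 120, 112, 119, 117]

w = levene_w(control, experimental)
# w is compared against F critical with (k - 1, n - k) degrees of
# freedom; failing to reject the null is what ANCOVA requires here.
```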
Academic Integration ANCOVA
The first step was to test the remaining two important assumptions for ANCOVA:
“homogeneity of regression” and presence of a linear relationship between the covariate
and the dependent variable. Levene's Test of Equality of Error Variances yielded
p calculated (.736) > p critical (.05) and F calculated (.114) < F critical (3.92), so
the null hypothesis stating that the error variance of the dependent variable is equal
across groups was not rejected. Therefore, this assumption was satisfied, and the
ANCOVA procedure could continue. The ANOVA results using the pretest as a covariate
revealed statistical significance [p calculated (.000) < p critical (.05) and
F calculated (80.766) > F critical (3.92)] with an eta² of .49 (49%), showing that the
null hypothesis of no relationship between the covariate (pretest) and the dependent
variable (academic integration) was rejected. The groups
were different at the beginning of the study, and therefore the pretest variable could
adjust for these differences as an appropriate covariate. All three assumptions that are
important for ANCOVA have been satisfied for academic integration.
The second step was to use the ANCOVA to compute the test statistic for
academic integration. The null hypothesis was that there was no difference in the
academic integration of the control and experimental groups. The ANCOVA results
using the pretest as a covariate revealed no statistical significance [p calculated
(.837) > p critical (.05) and F calculated (.043) < F critical (3.92)] with an eta² of
.001 (0.1%), showing that the null hypothesis was not rejected. There was no difference
in the academic integration of the control and experimental groups. The ANCOVA was
then analyzed again using Cook's distance as a diagnostic tool to remove outlier data.
Once again, the ANCOVA results using the pretest as a covariate revealed no statistical
significance [p calculated (.474) > p critical (.05) and F calculated (.571) <
F critical (3.92)] with an eta² of .006 (0.6%), showing that the null hypothesis was
not rejected. There is no difference in the academic integration
of the control and experimental groups. These statistics are summarized
in Table 4.9 Academic Integration ANCOVA.
Table 4.9 Academic Integration ANCOVA

Test                                                 DF     p (p critical = .05)   F (F critical = 3.92)   Eta Squared
Levene’s Test of Equality of Error Variance          1, 86  .736                   .114                    NA
Pretest Covariate ANOVA                              1, 85  .000                   80.766                  .49 (49%)
ANCOVA with Pretest as Covariate                     1, 85  .837                   .043                    .001 (0.1%)
ANCOVA with Pretest as Covariate (Outliers removed)  1, 85  .474                   .517                    .006 (0.6%)
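The ANCOVA test statistic reported in Table 4.9 can be understood as a nested-model comparison: the group effect is the reduction in residual sum of squares when group membership is added to a regression that already contains the pretest covariate. The sketch below illustrates this with fabricated scores and NumPy least squares; it shows the technique only and is not the study’s actual computation.

```python
import numpy as np

# Fabricated pretest/posttest scores for 8 control and 8 experimental
# students (NOT the study's data)
pre = np.array([40.0, 45.0, 38.0, 50.0, 42.0, 47.0, 44.0, 39.0,
                41.0, 46.0, 37.0, 49.0, 43.0, 48.0, 45.0, 40.0])
post = np.array([48.0, 52.0, 45.0, 58.0, 50.0, 55.0, 51.0, 46.0,
                 49.0, 53.0, 44.0, 57.0, 50.0, 56.0, 52.0, 47.0])
group = np.array([0] * 8 + [1] * 8)      # 0 = control, 1 = experimental

def rss(X, y):
    """Residual sum of squares of an ordinary least-squares fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid)

ones = np.ones_like(pre)
rss_reduced = rss(np.column_stack([ones, pre]), post)       # pretest only
rss_full = rss(np.column_stack([ones, pre, group]), post)   # pretest + group

df_error = len(post) - 3                 # N minus 3 estimated parameters
f_group = (rss_reduced - rss_full) / (rss_full / df_error)  # F, df (1, df_error)
eta_sq = (rss_reduced - rss_full) / rss_reduced             # partial eta squared
```

As in the study, an F for the group term below the critical value means failing to reject the null hypothesis of no treatment effect after adjusting for pretest differences.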
Social Integration ANCOVA
The first step was to test the remaining two important assumptions for ANCOVA:
“homogeneity of regression” and presence of a linear relationship between the covariate
and the dependent variable. The Levene’s Test of Equality of Error Variances yielded a
p calculated (.560) > p critical (0.05) and F calculated (.342) < F critical (3.92), so the null
hypothesis stating that the error variance of the dependent variable is equal across
groups was not rejected. Therefore, this assumption was satisfied and the ANCOVA
procedure could continue. The ANOVA results using the pretest as a covariate revealed
statistical significance [p calculated (.000) < p critical (0.05) and F calculated (116.823) > F critical
(3.92)] with an eta squared of .582 (58.2%), showing that the null hypothesis of no
relationship between the covariate (pretest) and the dependent variable (social
integration) is rejected. The groups were different at the beginning of the study, and
therefore the pretest variable can adjust for these differences as an appropriate covariate.
All three assumptions that are important for ANCOVA have been satisfied for social
integration.
The second step was to use the ANCOVA to compute the test statistic for social
integration. The null hypothesis was that there was no difference in the social integration
of the control and experimental groups. The ANCOVA results using the pretest as a
covariate revealed no statistical significance [p calculated (.760) > p critical (0.05) and
F calculated (.094) < F critical (3.92)] with an eta squared of .001 (0.1%), showing that the
null hypothesis is not rejected. There is no difference in the social integration of the
control and experimental groups. The ANCOVA was then run again using Cook’s
distance as a diagnostic tool to remove outlier data. Once again, the ANCOVA results
using the pretest as a covariate revealed no statistical significance [p calculated (.486) > p
critical (0.05) and F calculated (.490) < F critical (3.92)] with an eta squared of .006 (0.6%),
showing that the null hypothesis is not rejected. These statistics are summarized in
Table 4.10.
Table 4.10 Social Integration ANCOVA

Test                                                 DF     p (p critical = .05)   F (F critical = 3.92)   Eta Squared
Levene’s Test of Equality of Error Variance          1, 86  .560                   .342                    NA
Pretest Covariate ANOVA                              1, 85  .000                   116.823                 .582 (58.2%)
ANCOVA with Pretest as Covariate                     1, 85  .760                   .094                    .001 (0.1%)
ANCOVA with Pretest as Covariate (Outliers removed)  1, 85  .486                   .490                    .006 (0.6%)
Summary of Findings
After an in-depth analysis of data using descriptive statistics, factor analysis, and
ANCOVA, the finding of this study was that there was no statistically significant difference
between the control and experimental groups on their academic and social integration as
measured by the CCSEQ. In other words, CMC did not have a positive or negative
impact on the integration of community college students as measured by the CCSEQ.
Table 4.11 Academic and social integration ANCOVA summarizes the data.
Table 4.11 Academic and Social Integration ANCOVA

Test                                                 Integration  DF     p (p critical = .05)   F (F critical = 3.92)   Eta Squared
Levene’s Test of Equality of Error Variance          Academic     1, 86  .736                   .114                    NA
                                                     Social       1, 86  .560                   .342                    NA
Pretest Covariate ANOVA                              Academic     1, 85  .000                   80.766                  .49 (49%)
                                                     Social       1, 85  .000                   116.823                 .582 (58.2%)
ANCOVA with Pretest as Covariate                     Academic     1, 85  .837                   .043                    .001 (0.1%)
                                                     Social       1, 85  .760                   .094                    .001 (0.1%)
ANCOVA with Pretest as Covariate (Outliers removed)  Academic     1, 85  .474                   .517                    .006 (0.6%)
                                                     Social       1, 85  .486                   .490                    .006 (0.6%)
CHAPTER 5
CONCLUSIONS
Chapter 5 interprets and discusses the results presented in Chapter 4. This
chapter first discusses the general findings that include a discussion of the methodology
and sample. The other sections in this chapter include interpretation based on statistical
procedures, interpretation related to previous research, limitations, implications, and
finally conclusions.
Introduction
Although research findings to date have documented that computer-mediated
communication (CMC) gets students involved, a substantial gap remained in determining
the impact of computer-mediated communication on academic and social integration of
community college students. Because computer technology, specifically computer-
mediated communication, has proliferated within teaching and learning in higher
education and because of the importance of academic and social integration, this study
was significant in documenting through quantitative data analysis the impact that
computer-mediated communication had on the academic and social integration of
community college students. The overall approach was to conduct a pretest-posttest
control-group experimental study using computer-mediated communication as the
experimental treatment. The Community College Student Experiences Questionnaire
(CCSEQ) was given to collect data that were used to measure the academic and social
integration of the control and experimental groups. The pretest and posttest data
collected for these groups were analyzed to see if the computer-mediated communication
treatment had an impact on the integration of community college students. The study
hypothesized that data analysis would show no difference in the academic and social
integration reported by the control and experimental groups.
General Findings
The following research question was addressed in this study: Does computer-
mediated communication have an impact on the academic and social integration of
community college students as measured by the CCSEQ? After an in-depth analysis of
data using descriptive statistics, factor analysis, and ANCOVA, the finding of this study
was that there was no statistically significant difference between the control and
experimental groups on their academic and social integration as measured by the
CCSEQ. In other words, CMC did not have a positive or negative impact on the
integration of community college students as measured by the CCSEQ.
Methodology
The experimental design of this study kept the experiences of the control and
experimental groups as identical as possible, except that the experimental group was
exposed to computer-mediated communication. CMC was facilitated by the Blackboard
Learning System. To encourage active participation in this study, the researcher assigned
a research project grade that accounted for 10 percent of the overall course average.
To facilitate academic integration the researcher used CMC in three ways. First,
weekly emails were sent by the researcher that encouraged students to actively participate
in the overall course and the CMC project. Also, student-student and student-instructor
emails were encouraged to address any course content or concerns. Student responses to
the CCSEQ item “Used E-mail to communicate with an instructor or other students about
a course,” showed that student use of email increased during the semester. The responses
for “Often” or “Very Often” increased from a pretest average of 26.8 % to a posttest
average of 70.7 %. Also, the student responses to this item for “Never” decreased from
43.9 % on the pretest to 4.9 % on the posttest.
Second, weekly discussion board assignments directly related to course topics
were posted and student responses were mandatory for the project grade. As the study
progressed, the researcher noticed that many of the posted responses to the discussion
board assignments appeared to be work copied and pasted from other students’ postings.
Also, many students were not posting to the discussion board assignments on a weekly
basis.
Third, the researcher hosted two weekly chat sessions during the evening directly
related to major exam reviews. Chat attendance fluctuated throughout the semester.
During the study, student feedback about scheduling conflicts and the lack of a chat
attendance guideline at the beginning of the study made this aspect of the CMC project
difficult to enforce.
To facilitate social integration the researcher used CMC in two ways. First, the
researcher designated a Student Forum Section on the Discussion Board for questions,
concerns, and suggestions. Second, the researcher encouraged students to host chats at
other times besides the researcher hosted chat times. During the 16 week semester, 20
messages were posted to the Student Forum by 12 of the 41 experimental group
participants, which suggested limited engagement. After checking the chat archives, the
researcher observed that several students logged onto the chat at sporadic times
throughout the study, but most often no other students were logged on at the same time,
which could have discouraged participation.
After analyzing the descriptive statistics of the research participants, the
researcher observed several areas that could have affected active participation in CMC.
In response to the CCSEQ questions related to time spent working on a job during the
college session, 65.9 % of the experimental group responded that they worked 11 to more
than 40 hours per week. Also, 46.4 % of the experimental group reported that their job
takes some or a lot of time from school work and 43.9 % reported that their family
responsibilities take some or a lot of time from school work.
Moreover, 70.7 % of the experimental group reported that they study 1 to 10
hours per week, and the remainder of the group (29.3 %) reported studying 11 to more
than 20 hours per week. Because most of these community college students had various
time constraints and minimal weekly study time during the semester, limited use of CMC
was observed and could have affected the results of this study.
After conducting this study, the researcher suggests the following to encourage
active participation in CMC in future studies. First, require students to post their
responses to the discussion board assignments weekly instead of allowing them to post responses
to all of the assignments at the end of the semester. Second, require students to attend at
least seven chat sessions during the semester that are hosted by the instructor, other
students, or themselves. Third, require students to post concerns, suggestions, and
questions to the Discussion Board Student Forum so that all students have access to this
information instead of just emailing these items to the instructor. Different results for
this study might have been obtained if more stringent CMC guidelines that encouraged
active participation had been enforced at the beginning of the study.
Sample
The total sample for this study consisted of 88 community college students
enrolled in the researcher’s Chemistry 1406 course at a North Central Texas community
college. The control group contained 47 students, and the experimental group contained
41. To increase sample size, the experimental group consisted of
students from a cross section of three different lectures and five different labs. Therefore,
participants did not have face-to-face interactions with all of the CMC experimental group
members to accompany their computer interactions. Because of this lack of face-to-face
interaction, a cohesive CMC group was not formed, which could have impacted the
results.
Many of the research participants were considered part-time students taking less
than 12 hours of coursework during the semester. The control group reported a total of
44.7 % for less than 12 hours of coursework taken this semester with the experimental
group at 53.7 %. The rest of the research participants were full-time students taking more
than 12 hours of coursework during the semester. The control group reported a total of
55.3 % for 12 or more hours with the experimental group at 46.3 %. The research
participants were enrolled in a daytime freshman-level (introductory) chemistry course.
The control group reported 85.1 % “day only” and the experimental group 78.0 %. Also,
the control group reported a total of 91.5 % with less than 46 total course credits with the
experimental group at 78.0 %. In sum, about half of the students in this sample attended
part-time and three-fourths attended during the daytime only. Different results for this
study could be obtained by a different group of students (e.g., more full-time vs. part-time
or more evening vs. day students).
Interpretation Based on Statistical Procedures
One area of concern in any quantitative study is the validity and reliability of the
instrument specified for gathering data. The instrument used in this study was the
Community College Student Experiences Questionnaire which has been adequately
tested for validity and reliability since 1991. Although the instrument does not contain a
direct measure of academic and social integration, researchers have used certain parts of
the instrument to measure these integrations. The seventeen indicators from the CCSEQ
that were used in this study to determine academic and social integration were suggested
by the Tinto model or validations of the Tinto model (Pascarella & Terenzini, 1979,
1983; Terenzini & Pascarella, 1977, 1980) and by other researchers (Douzenis, 1994;
Moss & Young, 1995). The nine indicators of the CCSEQ that were used to determine the
patterns of academic integration in this study were the following:
1) GPA;
2) Time spent studying or preparing for classes;
3) Course activities (10-item scale);
4) Library activities (7-item scale);
5) Learning and study skills (9-item scale);
6) Writing activities (8-item scale);
7) Science activities (11-item scale);
8) Computer technology (8-item scale);
9) Experiences with faculty (9-item scale).
The eight indicators of the CCSEQ that were used to determine the patterns of social
integration were the following:
1) Student acquaintances (6-item scale);
2) Art, music, theater (9-item scale);
3) Hours spent on campus;
4) Clubs and organizations (7-item scale);
5) Athletics (6-item scale);
6) Counseling and career planning activities (8-item scale);
7) College environment (8-item scale); and
8) Estimate of gain (25-item scale).
Since no published research was located that had statistically verified that these
indicators are moderately or strongly correlated to each other and therefore actually do
measure academic and social integration, the researcher conducted factor analyses on
these indicators. After conducting the factor analyses on the pretest data, a different
combination of indicators made up the academic and social integration dependent
variables. Based on the results of the academic integration factor analyses, this
dependent variable did not include all nine of the original indicators. The academic
integration score was made up of the following six indicators: Course Activities, Library
Activities, Writing Activities, Science Activities (omit item 10), Computer Technology,
and Faculty Activities. The following three indicators were omitted from the academic
integration dependent variable: GPA, Hours Studying per Week, and Learning and Study
Skills. Moreover, based on the results of the social integration factor analyses, this
dependent variable did not include all eight of the original indicators. The social
integration dependent variable was made up of the following five indicators: Art, Music,
and Theater Activities (omit items 2 and 4); Clubs and Organizations; Athletic Activities
(omit 5); Counseling and Career Planning; and Estimate of Gains (omit items 5, 14, and
18). The following three indicators were omitted from the social integration dependent
variable: Student Acquaintances, Time Spent on Campus, and College Environment.
This study’s use of factor analyses to quantify and verify the academic and social
integration dependent variables should serve as a model for future researchers who use
the CCSEQ to measure academic and social integration.
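The screening idea behind those factor analyses can be illustrated with a much simpler item-total correlation check: an indicator belongs in a composite only if it correlates at least moderately with the composite formed by the remaining indicators. The sketch below uses fabricated scores, a hypothetical .40 cutoff, and only four indicator names; it is a simplified stand-in for the full factor analyses, not a reproduction of them.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 88                                    # sample size, as in this study

# Fabricated indicator scores: three share a common "integration" factor,
# one (GPA) is unrelated, mimicking the kind of indicator the factor
# analyses dropped
common = rng.normal(size=n)
indicators = {
    "Course Activities":  common + 0.5 * rng.normal(size=n),
    "Library Activities": common + 0.7 * rng.normal(size=n),
    "Writing Activities": common + 0.6 * rng.normal(size=n),
    "GPA":                rng.normal(size=n),
}

retained = []
for name, scores in indicators.items():
    # Composite of all the OTHER indicators
    composite = sum(v for k, v in indicators.items() if k != name)
    r = np.corrcoef(scores, composite)[0, 1]
    if abs(r) >= 0.4:                     # assumed "moderate" cutoff
        retained.append(name)
```

Here the unrelated indicator fails the screen and would be omitted from the composite, much as GPA, hours studying, and learning and study skills were dropped from the academic integration score above.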
The preferred statistical method for this study was analysis of covariance
(ANCOVA), in which the posttest mean of the experimental group was compared to the
posttest mean of the control group with the pretest scores used as a covariate. Since all of
the underlying assumptions of ANCOVA were satisfied, this study used ANCOVA with
a pretest covariate. After an in-depth analysis of data, the finding of this study was that
there was no difference between the control and experimental groups on academic and
social integration.
To verify these results the researcher did two things. First, the researcher entered
the pretest and posttest data for the control and experimental groups twice to check for
possible data entry errors. The resulting academic and social integration scores and
means were the same both times. Second, the researcher conducted an additional
ANCOVA using Cook’s distance as a diagnostic tool to remove outlier data. Once
again, the ANCOVA results using the pretest as a covariate revealed no statistical
significance.
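The Cook’s distance screen used in that verification step can be sketched as follows: for each observation, Cook’s distance estimates how much the fitted regression would shift if that observation were deleted, and unusually influential cases are set aside before refitting. The data below and the common 4/n flagging cutoff are illustrative assumptions; the study’s own diagnostic was run within its statistical software.

```python
import numpy as np

# Fabricated pretest (x) and posttest (y) scores; the last case is an
# implausible, high-leverage outlier planted for illustration
x = np.array([40.0, 45.0, 38.0, 50.0, 42.0, 47.0, 44.0, 39.0, 70.0])
y = np.array([48.0, 52.0, 45.0, 58.0, 50.0, 55.0, 51.0, 46.0, 20.0])

X = np.column_stack([np.ones_like(x), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta

n, p = X.shape
h = np.diag(X @ np.linalg.inv(X.T @ X) @ X.T)   # leverages (hat matrix diagonal)
mse = resid @ resid / (n - p)

# Cook's distance: D_i = e_i^2 / (p * MSE) * h_i / (1 - h_i)^2
cooks_d = (resid ** 2 / (p * mse)) * (h / (1 - h) ** 2)

keep = cooks_d < 4 / n                           # observations to retain and refit
```

The planted outlier receives by far the largest distance and is excluded; the ANCOVA would then be rerun on the retained cases, as the study did.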
Interpretation Related to Previous Research
Most of the studies on academic and social integration and the behaviors students
can engage in to achieve such integration were developed before the infusion of
technology into higher education. Only two studies were located during the extensive
literature review process that related some aspect of computer technology with academic
and social integration. The results of this study are coherent with both of these studies.
Gatz and Hirt (2000) conducted a study at a large, public, research university to
gain a better understanding of whether email was replacing traditional behaviors in which
college students engage to achieve academic and social integration. The results indicated
that while the participants did use email for some academic and social integration
purposes, the bulk of their email activity did not relate to either form of integration.
Ashmore (2000) examined computer engagement of students at a two-year school to
determine if computer engagement had an impact upon perceived growth and
development, and if this engagement had an effect on academic and social involvement.
The results showed that computer usage did not alter the effects of academic and social
involvement. This study on the impact of computer-mediated communication on the
academic and social integration of community college students also showed no
relationship between aspects of computer technology use, in particular CMC, and
academic and social integration.
Limitations
Because the researcher was unaware of any other instructors who augmented their
classes with computer-mediated communication at the community college used for this
study, the researcher was also the instructor of the research participants. As more
instructors and students are exposed to CMC, they will become more adept at using this
technology. Also, as CMC technology becomes more available and accessible,
community college instructors and students will gain a better understanding of how CMC
could be used to impact integration.
Because the research participants from a cross section of three lectures and five
labs were randomly assigned to the control and experimental groups, there were limited
face-to-face interactions to accompany computer interactions and possibly some diffusion
of treatment between the groups. Also, since the students were randomly assigned to the
control and experimental groups they did not get to choose their research project.
Therefore, some of the experimental group participants may have viewed CMC
negatively from the beginning of the study which could have influenced their responses
to the CCSEQ. Different results may be observed if intact research groups were used.
The timeframe for data collection was one full semester (16 weeks). This
timeframe may have been too short to adequately measure academic and social
integration. Different results may be obtained if a two semester course was used or a
longitudinal study was conducted.
By design, the study was limited to public two-year college students enrolled in
the researcher’s Chemistry 1406 course. Different results may be observed if the study
incorporated other non-traditional students from different disciplines or independent two-
year college students. Also, different results may be observed if the study incorporated
higher education students from different types of institutions such as public research
universities or liberal arts colleges.
In terms of the quantitative research design, the data collection device was a
questionnaire (i.e., the CCSEQ) designed specifically for the community college setting,
and its responses are subject to the perception bias of the respondents. The perception
of the individual completing the questionnaire is not a factual or quantifiable response.
Perception is, by
nature, an individualistic and subjective method of judgment. While the statistical
analysis of the responses yields quantifiable data, it is necessary to remember that the
original information that generated the data was qualified by individual perception and
bias. The fact that the information gathered was by perception does not invalidate the
findings, but the information must be interpreted with caution.
Despite these limitations, this study examined for the first time the impact that
computer-mediated communication had on the academic and social integration of
community college students with supporting documentation through quantitative data
analysis. This study provided an experimental methodology that future researchers might
replicate or modify to further explore this topic. This study also documents the use of
factor analyses on the CCSEQ data to quantify and verify the academic and social
integration constructs.
Implications
Based on the conclusions of this study, it is important to examine its implications
for both future research and practice. While
the investigation did not examine every aspect of the impact of computer-mediated
communication on academic and social integration, the results provided a basis for
further research on the topic.
The present study examined the impact of CMC on community college students at
one public two-year college enrolled in the researcher’s Chemistry 1406 course. An
investigation of CMC impact on academic and social integration incorporating other non-
traditional students from different disciplines or independent two-year college students
might reveal different results. Also, investigations at other types of institutions such as
public research and liberal arts institutions might reveal results that would strengthen the
body of research on this topic.
To increase sample size the experimental group consisted of students from a cross
section of three different lectures and five different labs. Therefore, participants did not
have face-to-face interactions with all of the CMC experimental group members to
accompany their computer interactions. A similar investigation with intact control and
experimental groups would add to the body of CMC research. Intact groups would
control for diffusion of treatment and possible negative attitudes resulting from random
assignment as well as facilitate a more cohesive CMC group.
While this study was designed to specifically examine the impact of CMC on
academic and social integration as measured by the CCSEQ, other studies might be
conducted with a broader focus. Research that explores other outcomes associated with
CMC among college students (e.g., development of critical thinking skills and
enhancement of pedagogy) might prove valuable.
This study answered the basic question of how CMC impacted academic and
social integration of community college students using ANCOVA with a pretest
covariate. Other studies might take into consideration other covariates such as age,
gender, race, grades, and reasons for attending college. These results would prove
valuable if specific groups were identified that were or were not academically and/or
socially integrated and further analysis could identify better ways to achieve integration.
Conclusions
Although research findings to date have documented that computer-mediated
communication (CMC) gets students involved, a substantial gap remained in determining
the impact of CMC on academic and social integration of community college students.
Because computer technology, specifically CMC, has proliferated within teaching and
learning in higher education and because of the importance of academic and social
integration, this study was significant in documenting through quantitative data analysis
the impact that CMC had on the integration of community college students. The
following research question was addressed: Does computer-mediated communication
have an impact on the academic and social integration of community college students as
measured by the CCSEQ? The study hypothesized that data analysis would show no
difference in the integrations reported by the control and experimental groups.
The overall approach was to conduct a pretest-posttest control-group experimental
study using CMC as the experimental treatment. The Community College Student
Experiences Questionnaire (CCSEQ) was given to collect data that were used to measure
the academic and social integration of the control and experimental groups. After an in-
depth analysis of data using descriptive statistics, factor analysis, and ANCOVA, the
finding of this study was that there was no statistically significant difference between the
control and experimental groups on their academic and social integration as measured by
the CCSEQ. In other words, CMC did not have a positive or negative impact on the
integration of community college students. This study examined for the first time the
impact that CMC had on the integration of community college students and provided an
experimental methodology that future researchers might replicate or modify to further
explore this topic. Because CMC will continue to increase as technology becomes more
available and accessible to faculty and students and because of the importance of
academic and social integration, further study on this relationship is vital to higher
education research.
APPENDIX A
COMMUNITY COLLEGE STUDENT EXPERIENCES QUESTIONNAIRE (CCSEQ)
APPENDIX B
BLACKBOARD LEARNING SYSTEM COMPUTER-MEDIATED
COMMUNICATION
APPENDIX C
CONTROL AND EXPERIMENTAL GROUP COMMUNICATIONS
CONTROL GROUP COMMUNICATION
Welcome to Chemistry 1406!

During the many years I have been studying and teaching chemistry, I have found chemistry to be an exciting intellectual challenge and an extraordinarily rich and varied part of our cultural heritage. I hope that as you advance in your study of chemistry this semester that you will share with me some of that enthusiasm, excitement, and appreciation. I also hope that you will come to realize the importance of chemistry in your everyday life and in preparing for your future.

Your first assignment for this course is to carefully read and reread the attached course information documents that are vital to your success in this course. These handouts include the following: chemistry orientation, syllabus, laboratory information, review sheets, etc. Pay close attention to the class project (10 percent of overall course grade) that has been randomly assigned to you and is due on or before _________________.

Two times during this semester (weeks 1 and 15) you will be asked to voluntarily and anonymously complete the Community College Student Experiences Questionnaire (CCSEQ) as a part of your project. If you choose not to complete the CCSEQ, your project grade will NOT be affected. The CCSEQ results will be used as part of a dissertation study in the Department of Higher Education at the University of North Texas under the direction of Dr. Jack Baier (940/565-3238). This study has been reviewed and approved by the UNT Committee for the Protection of Human Subjects (940/565-3940). The detailed use of the questionnaire results will be provided in writing at the completion of this semester.

Here is some advice that will be helpful to you as you study and learn chemistry this semester:
• Keep up with your studying day to day.
• Focus your study on the exam review statements.
• Attend class regularly.
• Keep good lecture notes within the textbook and review them carefully.
• Skim topics in the text before they are covered in lecture.
• After lecture, carefully read the topics covered in class and pay close attention to related example problems.
• Learn the language of chemistry.
• Attempt all of the assigned problems and exercises.
• If you do poorly, honestly analyze the reasons, eliminate the problems, get back up, shake it off, and keep on charging!

Feel free to contact me if you have any questions or concerns during this semester. You may reach me by office phone/voice mail (817) 515-3374 or by stopping by my office SEE 258 during posted office hours.

Best Wishes Always,
David Dollar
Chemistry Instructor
Welcome to Chemistry 1406! During the many years I have been studying and teaching chemistry, I have found chemistry to be an exciting intellectual challenge and an extraordinarily rich and varied part of our cultural heritage. I hope that as you advance in your study of chemistry this semester that you will share with me some of that enthusiasm, excitement, and appreciation. I also hope that you will come to realize the importance of chemistry in your everyday life and in preparing for your future. The class project (10 percent of your overall course grade) that has been randomly assigned to you will be accessing and participating in computer-mediated communication. Two times during this semester (weeks 1 and 15) you will be asked to voluntarily and anonymously complete the Community College Student Experiences Questionnaire (CCSEQ) as a part of your project. If you choose not to complete the CCSEQ, your project grade will NOT be affected. The CCSEQ results will be used in as part of a dissertation study in the Department of Higher Education at the University of North Texas under the direction of Dr. Jack Baier (940/565-3238). This study has been reviewed and approved by the UNT Committee for the Protection of Human Subjects (940/565-3940). The detailed use of the questionnaire results will be provided in writing at completion of this semester. You will need access to the Internet either from home or on campus to complete your project. The most convenient places on campus with extensive operating hours and staff help are the Science Learning Center (SEE 264) or the Library. Your first assignment in this course is to begin your project by emailing me at [email protected] with the subject heading of Chemistry 1406 Student by the end of Week 1 or preferably by the end of today. If you do not have an email address, you are required to have one to complete your project. The most convenient place to get a free email address is at www.hotmail.com. 
The directions to help you create an email account are straightforward and easy to follow. After you have created it, remember to email me as your first assignment with the subject heading Chemistry 1406 Student. This email address is necessary so that you can be enrolled into the Chem1406 Course Website that will facilitate the computer-mediated communication for your project. Please do not hesitate to contact me by phone or during posted office hours if you need assistance. After you email me, I will enroll you into the Blackboard 5 Chem1406 Course Website and your username and password will be emailed to you. You may then access the computer-mediated communication at http://coursesites.blackboard.com. Add this website, Blackboard 5 Entry Page, to your web browser “favorites” for easier access throughout the semester. Select login and enter your username and password that was emailed to you and login. You should now see Chem1406 under the My Courses Section. Select Chem1406 and you are at the course webpage. The course information documents that will help you be successful in this course include the following: chemistry orientation, syllabus, laboratory information, review sheets, etc. and are located on the Chem1406 Website for you to access and print. Your project grade will be determined by your active and full participation in the computer-mediated communication throughout the entire semester. The details of your participation are more fully explained and detailed in the Announcements Section of the blackboard course website. 
Here is some advice that will be helpful to you as you study and learn chemistry this semester: keep up with your studying day to day; focus your study on the exam review statements; attend class regularly; keep good lecture notes within the textbook and review them carefully; skim topics in the text before they are covered in lecture; after lecture, carefully read the topics covered in class and pay close attention to related example problems; learn the language of chemistry; attempt all of the assigned problems and exercises; and if you do poorly, honestly analyze the reasons, eliminate the problems, get back up, shake it off, and keep on charging!

Feel free to contact me if you need help getting your email address or have any questions or concerns during the semester. The best way to reach me is through the email available on the computer-mediated communication course website. You may also reach me by office phone/voice mail at (817) 515-3374 or by stopping by my office, SEE 258, during posted office hours.

Best Wishes Always,
David Dollar
Chemistry Instructor
Dear Participant:

Thank you so much for voluntarily and anonymously completing the Community College Student Experiences Questionnaire (CCSEQ). The collective data from the questionnaire will be kept confidential and reported only in aggregate; in other words, the data will in no way reflect any one student but rather a group of students.

As you know, 10 percent of your grade for this semester was your project, and the project you completed was randomly assigned to you. The grade you received was equitably based on either the research paper you submitted or your participation in computer-mediated communication. The purpose of having two different projects was to allow a study of the impact of computer-mediated communication on the academic and social integration of community college students. Although research findings to date have documented that computer-mediated communication gets students involved, a substantial gap remains in determining its impact on academic and social integration. Because computer technology, specifically computer-mediated communication, has proliferated within teaching and learning in higher education, and because of the importance of academic and social integration, this study will be significant in documenting through quantitative data analysis the impact that computer-mediated communication has on the academic and social integration of community college students.

The collective data from the CCSEQ will be used to measure the academic and social integration of the two project groups. This study hypothesizes that community college students who use computer-mediated communication will score significantly higher in academic and social integration than those who do not. Thank you again for working so hard this semester and for voluntarily and anonymously taking the questionnaire.
The CCSEQ results will be used as part of a dissertation study in the Department of Higher Education at the University of North Texas under the direction of Dr. Jack Baier (940/565-3238). This study has been reviewed and approved by the UNT Committee for the Protection of Human Subjects (940/565-3940). The results of this study, entitled The Impact of Computer-Mediated Communication on the Academic and Social Integration of Community College Students, will be available Fall 2003 and provided to you upon request.

Best Wishes Always,
David Dollar
Chemistry Instructor
APPENDIX D
CONTROL AND EXPERIMENTAL GROUP RESPONSES TO CCSEQ
Table D.1
CCSEQ College Courses Items for the Control and Experimental Groups
Percentages; course counts reported as None / One / More than one

1. Number of courses taken in each general education area
                                          Control              Experimental
   College Math (not remedial math)       63.8 / 25.5 / 10.6   51.2 / 36.6 / 12.2
   Computer Literacy                      59.6 / 31.9 / 8.5    68.3 / 22.0 / 9.8
   Remedial English class or classes      59.6 / 17.0 / 23.4   53.7 / 31.7 / 14.6
   English Composition (college level)    14.9 / 36.2 / 48.9   29.3 / 24.4 / 46.3
   Fine Arts                              61.7 / 31.9 / 6.4    65.9 / 29.3 / 4.9
   Foreign Languages                      83.0 / 10.6 / 6.4    85.4 / 7.3 / 7.3
   Humanities                             51.1 / 21.3 / 27.7   41.5 / 29.3 / 29.3
   Remedial Math class or classes         36.2 / 48.9 / 14.9   58.5 / 22.0 / 19.5
   Physical or Health Education           44.7 / 36.2 / 19.1   53.7 / 36.6 / 9.8
   Sciences                               6.4 / 25.5 / 68.1    2.4 / 34.1 / 63.4
   Social Sciences                        31.9 / 14.9 / 53.2   19.5 / 36.6 / 43.9
   Speech, Communications                 44.7 / 51.1 / 4.3    36.6 / 58.5 / 4.9

Yes / No items                            Control              Experimental
2. Working for an AA degree               21.3 / 78.7          14.6 / 85.4
3. Working for an AS degree               63.8 / 36.2          43.9 / 56.1
4. Working for a diploma                  40.4 / 59.6          29.3 / 70.7
5. Working for a certificate              19.1 / 80.9          12.2 / 87.8
6. Plan to transfer to a four-year
   college or university                  57.4 / 42.6          65.9 / 34.1
Table D.2
CCSEQ Learning and Study Skills Instruction Received for the Control Group
Percentages; amount of instruction received: None / Some / A lot

                             Pretest               Posttest
Memory skills                70.2 / 21.3 / 8.5     42.6 / 36.2 / 21.3
Note taking skills           68.1 / 19.1 / 12.8    48.9 / 29.8 / 21.3
Listening skills             68.1 / 17.0 / 14.9    51.1 / 19.1 / 29.8
Speaking skills              76.6 / 17.0 / 6.4     53.2 / 29.8 / 17.0
Writing skills               63.8 / 27.7 / 8.5     38.3 / 48.9 / 12.8
Reading skills               63.8 / 23.4 / 12.8    53.2 / 29.8 / 17.0
Test taking skills           72.3 / 17.0 / 10.6    48.9 / 25.5 / 25.5
Time management skills       78.7 / 12.8 / 8.5     53.2 / 27.7 / 19.1
Problem solving skills       72.3 / 19.1 / 8.5     46.8 / 34.0 / 19.1
Table D.3
CCSEQ College Activities Items for the Control Group
Percentages; response scale: Never / Occasionally / Often / Very Often

1. Participated in class discussions.
   Pretest: 6.4 / 55.3 / 21.3 / 17.0     Posttest: 10.6 / 55.3 / 23.4 / 10.6
2. Worked on a paper or project that combined ideas from different sources of information.
   Pretest: 17.0 / 34.0 / 38.3 / 10.6    Posttest: 2.1 / 21.3 / 51.1 / 25.5
3. Summarized major points and information from readings or notes.
   Pretest: 12.8 / 38.3 / 36.2 / 12.8    Posttest: 2.1 / 40.4 / 38.3 / 19.1
4. Tried to explain the material to another student.
   Pretest: 17.0 / 51.1 / 19.1 / 12.8    Posttest: 2.1 / 48.9 / 29.8 / 19.1
5. Did additional readings on topics that were introduced and discussed in class.
   Pretest: 29.8 / 51.1 / 14.9 / 4.3     Posttest: 14.9 / 53.2 / 19.1 / 12.8
6. Asked questions about points made in class discussions or readings.
   Pretest: 25.5 / 48.9 / 23.4 / 2.1     Posttest: 21.3 / 48.9 / 21.3 / 8.5
7. Studied course materials with other students.
   Pretest: 23.4 / 38.3 / 27.7 / 10.6    Posttest: 23.4 / 36.2 / 17.0 / 23.4
8. Applied principles and concepts learned in class to understand other problems or situations.
   Pretest: 19.1 / 38.3 / 29.8 / 12.8    Posttest: 17.0 / 44.7 / 14.9 / 23.4
9. Compared and contrasted different points of view presented in a course.
   Pretest: 21.3 / 48.9 / 25.5 / 4.3     Posttest: 25.5 / 42.6 / 19.1 / 12.8
10. Considered the accuracy and credibility of information from different sources.
   Pretest: 21.3 / 57.4 / 12.8 / 8.5     Posttest: 25.5 / 46.8 / 17.0 / 10.6
Table D.4
CCSEQ Library Activities Items for the Control Group
Percentages; response scale: Never / Occasionally / Often / Very Often

1. Used the library as a quiet place to read or study material you brought with you.
   Pretest: 34.0 / 25.5 / 21.3 / 19.1    Posttest: 23.4 / 46.8 / 12.8 / 17.0
2. Read newspapers, magazines, or journals located in the library or on-line.
   Pretest: 48.9 / 29.8 / 14.9 / 6.4     Posttest: 34.0 / 44.7 / 12.8 / 8.5
3. Checked out books and other materials to read at home.
   Pretest: 51.1 / 36.2 / 8.5 / 4.3      Posttest: 55.3 / 38.3 / 4.3 / 2.1
4. Used the card catalogue or computer to find materials the library had on a topic.
   Pretest: 55.3 / 21.3 / 14.9 / 8.5     Posttest: 55.3 / 29.8 / 10.6 / 4.3
5. Prepared a bibliography or set of references for a term paper or report.
   Pretest: 38.3 / 25.5 / 29.8 / 6.4     Posttest: 27.7 / 31.9 / 27.7 / 12.8
6. Asked the librarian for help in finding materials on some topic.
   Pretest: 44.7 / 29.8 / 17.0 / 8.5     Posttest: 46.8 / 34.0 / 12.8 / 6.4
7. Found some interesting material to read just by browsing in the stacks.
   Pretest: 66.0 / 23.4 / 10.6 / 0.0     Posttest: 53.2 / 31.9 / 10.6 / 4.3
Table D.5
CCSEQ Faculty Items for the Control Group
Percentages; response scale: Never / Occasionally / Often / Very Often

1. Asked an instructor for information about grades, make-up work, assignments, etc.
   Pretest: 14.9 / 53.2 / 19.1 / 12.8    Posttest: 2.1 / 59.6 / 23.4 / 14.9
2. Talked briefly with an instructor after class about course content.
   Pretest: 29.8 / 44.7 / 17.0 / 8.5     Posttest: 19.1 / 53.2 / 12.8 / 14.9
3. Made an appointment to meet with an instructor in his/her office.
   Pretest: 61.7 / 25.5 / 8.5 / 4.3      Posttest: 51.1 / 38.3 / 4.3 / 6.4
4. Discussed ideas for a term paper or other class project with an instructor.
   Pretest: 44.7 / 38.3 / 12.8 / 4.3     Posttest: 34.0 / 53.2 / 8.5 / 4.3
5. Discussed your career and/or educational plans, interests, and ambitions with an instructor.
   Pretest: 55.3 / 31.9 / 6.4 / 6.4      Posttest: 57.4 / 31.9 / 6.4 / 4.3
6. Discussed comments an instructor made on a test or paper you wrote.
   Pretest: 51.1 / 38.3 / 6.4 / 4.3      Posttest: 57.4 / 36.2 / 2.1 / 4.3
7. Talked informally with an instructor about current events, campus activities, or other common interests.
   Pretest: 55.3 / 34.0 / 6.4 / 4.3      Posttest: 66.0 / 23.4 / 6.4 / 4.3
8. Discussed your school performance, difficulties or personal problems with an instructor.
   Pretest: 66.0 / 21.3 / 8.5 / 4.3      Posttest: 63.8 / 29.8 / 2.1 / 4.3
9. Used electronic mail (E-mail) to communicate with your instructor.
   Pretest: 48.9 / 29.8 / 12.8 / 8.5     Posttest: 38.3 / 34.0 / 17.0 / 10.6
Table D.6
CCSEQ Student Acquaintances Items for the Control Group
Percentages; response scale: Never / Occasionally / Often / Very Often

1. Had serious discussions with students who were much older or much younger than you.
   Pretest: 31.9 / 38.3 / 23.4 / 6.4     Posttest: 29.8 / 40.4 / 23.4 / 6.4
2. Had serious discussions with students whose ethnic or cultural background was different from yours.
   Pretest: 29.8 / 40.4 / 23.4 / 6.4     Posttest: 31.9 / 31.9 / 21.3 / 14.9
3. Had serious discussions with students whose philosophy of life or personal values were very different from yours.
   Pretest: 38.3 / 42.6 / 14.9 / 4.3     Posttest: 36.2 / 31.9 / 17.0 / 14.9
4. Had serious discussions with students whose political opinions were very different from yours.
   Pretest: 59.6 / 27.7 / 12.8 / 0.0     Posttest: 40.4 / 42.6 / 10.6 / 6.4
5. Had serious discussions with students whose religious beliefs were very different from yours.
   Pretest: 38.3 / 46.8 / 12.8 / 2.1     Posttest: 38.3 / 31.9 / 17.0 / 12.8
6. Had serious discussions with students from a country different from yours.
   Pretest: 46.8 / 31.9 / 17.0 / 4.3     Posttest: 31.9 / 40.4 / 12.8 / 14.9
Table D.7
CCSEQ Art, Music, Theater Activities Items for the Control Group
Percentages; response scale: Never / Occasionally / Often / Very Often

1. Talked about art (painting, sculpture, architecture, artists, etc.) with other students at the college.
   Pretest: 74.5 / 21.3 / 0.0 / 4.3      Posttest: 66.0 / 25.5 / 0.0 / 8.5
2. Talked about music (classical, popular, musicians, etc.) with other students at the college.
   Pretest: 48.9 / 38.3 / 10.6 / 2.1     Posttest: 44.7 / 38.3 / 4.3 / 12.8
3. Talked about theater (plays, musicals, dance, etc.) with other students at the college.
   Pretest: 68.1 / 27.7 / 2.1 / 2.1      Posttest: 61.7 / 25.5 / 4.3 / 8.5
4. Attended an art exhibit on the campus.
   Pretest: 76.6 / 21.3 / 2.1 / 0.0      Posttest: 72.3 / 23.4 / 2.1 / 2.1
5. Attended a concert or other musical event at the college.
   Pretest: 76.6 / 19.1 / 2.1 / 2.1      Posttest: 72.3 / 17.0 / 8.5 / 2.1
6. Attended a play, dance, concert, or other theater performance at the college.
   Pretest: 70.2 / 25.5 / 4.3 / 0.0      Posttest: 70.2 / 21.3 / 4.3 / 4.3
7. Participated in an art exhibit, musical event, or theater performance at the college.
   Pretest: 87.2 / 10.6 / 2.1 / 0.0      Posttest: 89.4 / 6.4 / 0.0 / 4.3
8. Attended an OFF-CAMPUS art exhibit, musical event, or theater performance for course credit.
   Pretest: 85.1 / 6.4 / 6.4 / 2.1       Posttest: 76.6 / 17.0 / 4.3 / 2.1
9. Participated in an OFF-CAMPUS art exhibit, musical event, or theater performance for course credit.
   Pretest: 87.2 / 6.4 / 4.3 / 2.1       Posttest: 83.0 / 12.8 / 2.1 / 2.1
Table D.8
CCSEQ Writing Activities Items for the Control Group
Percentages; response scale: Never / Occasionally / Often / Very Often

1. Used a dictionary [or computer (word processor) spell-check/thesaurus] to look up the proper meaning, definition, and/or spelling of words.
   Pretest: 12.8 / 31.9 / 25.5 / 29.8    Posttest: 4.3 / 29.8 / 23.4 / 42.6
2. Prepared an outline to organize the sequence of ideas and points in a paper you were writing.
   Pretest: 21.3 / 23.4 / 27.7 / 27.7    Posttest: 10.6 / 31.9 / 23.4 / 34.0
3. Thought about grammar, sentence structure, paragraphs and word choice as you were writing.
   Pretest: 14.9 / 19.1 / 34.0 / 31.9    Posttest: 2.1 / 27.7 / 40.4 / 29.8
4. Wrote a rough draft of a paper or essay and revised it before handing it in.
   Pretest: 17.0 / 17.0 / 27.7 / 38.3    Posttest: 8.5 / 25.5 / 29.8 / 36.2
5. Used a computer (word processor) to write or type a paper.
   Pretest: 14.9 / 8.5 / 29.8 / 46.8     Posttest: 4.3 / 14.9 / 21.3 / 59.6
6. Asked other people to read something you wrote to see if it was clear to them.
   Pretest: 25.5 / 23.4 / 29.8 / 21.3    Posttest: 17.0 / 29.8 / 29.8 / 23.4
7. Spent at least 5 hours or more writing a paper.
   Pretest: 21.3 / 38.3 / 23.4 / 17.0    Posttest: 21.3 / 36.2 / 19.1 / 23.4
8. Asked an instructor for advice and help to improve your writing or about a comment he/she made on a paper you wrote.
   Pretest: 36.2 / 44.7 / 6.4 / 12.8     Posttest: 51.1 / 21.3 / 14.9 / 12.8
Table D.9
CCSEQ Science Activities Items for the Control Group
Percentages; response scale: Never / Occasionally / Often / Very Often

1. Memorized formulas, definitions, technical terms.
   Pretest: 25.5 / 14.9 / 29.8 / 29.8    Posttest: 0.0 / 14.9 / 38.3 / 46.8
2. Practiced to improve your skills in using laboratory equipment.
   Pretest: 31.9 / 31.9 / 23.4 / 12.8    Posttest: 12.8 / 27.7 / 29.8 / 29.8
3. Showed a classmate how to use a piece of scientific equipment.
   Pretest: 40.4 / 40.4 / 12.8 / 6.4     Posttest: 4.3 / 46.8 / 23.4 / 25.5
4. Attempted to explain an experimental procedure to a classmate.
   Pretest: 46.8 / 38.3 / 14.9 / 0.0     Posttest: 0.0 / 46.8 / 34.0 / 19.1
5. Tested your understanding of some scientific principle by seeing if you could explain it to another student.
   Pretest: 48.9 / 34.0 / 12.8 / 4.3     Posttest: 21.3 / 38.3 / 19.1 / 21.3
6. Completed an experiment/project using scientific methods.
   Pretest: 51.1 / 29.8 / 12.8 / 6.4     Posttest: 4.3 / 36.2 / 29.8 / 29.8
7. Talked about social and ethical issues related to science and technology such as energy, pollution, chemicals, genetics, etc.
   Pretest: 53.2 / 29.8 / 8.5 / 8.5      Posttest: 23.4 / 46.8 / 14.9 / 14.9
8. Used information you learned in a science class to understand some aspect of the world around you.
   Pretest: 34.0 / 25.5 / 25.5 / 14.9    Posttest: 12.8 / 42.6 / 25.5 / 19.1
9. Tried to explain to someone the scientific basis for environmental concerns about pollution, recycling, alternative forms of energy, etc.
   Pretest: 63.8 / 25.5 / 6.4 / 4.3      Posttest: 36.2 / 44.7 / 8.5 / 10.6
10. Did paid or volunteer work OFF-CAMPUS to help the environment after learning about environmental issues in class.
   Pretest: 93.6 / 6.4 / 0.0 / 0.0       Posttest: 78.7 / 14.9 / 6.4 / 0.0
11. Applied information or skills you learned in a science class to work (either volunteer or paid) outside of class.
   Pretest: 74.5 / 14.9 / 10.6 / 0.0     Posttest: 55.3 / 23.4 / 12.8 / 8.5
Table D.10
CCSEQ Athletic Activities Items for the Control Group
Percentages; response scale: Never / Occasionally / Often / Very Often

1. Followed a regular exercise program on campus.
   Pretest: 59.6 / 19.1 / 12.8 / 8.5     Posttest: 55.3 / 19.1 / 17.0 / 8.5
2. Sought athletic instruction.
   Pretest: 70.2 / 19.1 / 6.4 / 4.3      Posttest: 57.4 / 27.7 / 10.6 / 4.3
3. Attended an athletic event on campus.
   Pretest: 89.4 / 10.6 / 0.0 / 0.0      Posttest: 87.2 / 10.6 / 0.0 / 2.1
4. Coached or assisted with youth athletic programs on campus.
   Pretest: 91.5 / 6.4 / 0.0 / 2.1       Posttest: 93.6 / 4.3 / 0.0 / 2.1
5. Coached or assisted with OFF-CAMPUS youth athletic programs for course credit.
   Pretest: 97.9 / 2.1 / 0.0 / 0.0       Posttest: 93.6 / 4.3 / 2.1 / 0.0
6. Participated in a sport on campus.
   Pretest: 89.4 / 8.5 / 2.1 / 0.0       Posttest: 89.4 / 6.4 / 0.0 / 4.3
Table D.11
CCSEQ Computer Technology Items for the Control Group
Percentages; response scale: Never / Occasionally / Often / Very Often

1. Used E-mail to communicate with an instructor or other students about a course.
   Pretest: 40.4 / 31.9 / 10.6 / 17.0    Posttest: 25.5 / 25.5 / 34.0 / 14.9
2. Used the World Wide WEB or INTERNET [or other computer network] to get information for a class project or paper.
   Pretest: 17.0 / 12.8 / 34.0 / 36.2    Posttest: 6.4 / 10.6 / 27.7 / 55.3
3. Used a computer tutorial to learn material for a course or remedial program.
   Pretest: 51.1 / 27.7 / 8.5 / 12.8     Posttest: 44.7 / 19.1 / 23.4 / 12.8
4. Used computers in a group (cooperative) learning situation in class.
   Pretest: 57.4 / 27.7 / 10.6 / 4.3     Posttest: 51.1 / 25.5 / 12.8 / 10.6
5. Used a computer for some type of database management.
   Pretest: 51.1 / 25.5 / 17.0 / 6.4     Posttest: 40.4 / 23.4 / 23.4 / 12.8
6. Used a computer to analyze data for a class project.
   Pretest: 44.7 / 29.8 / 17.0 / 8.5     Posttest: 29.8 / 27.7 / 27.7 / 14.9
7. Used a computer to create graphs or charts for a class paper or project.
   Pretest: 55.3 / 25.5 / 17.0 / 2.1     Posttest: 42.6 / 23.4 / 19.1 / 14.9
8. Wrote an application using existing software or programming languages.
   Pretest: 83.0 / 12.8 / 4.3 / 0.0      Posttest: 63.8 / 14.9 / 12.8 / 8.5
Table D.12
CCSEQ Clubs and Organizations Items for the Control Group
Percentages; response scale: Never / Occasionally / Often / Very Often

1. Looked for notices about campus events and student organizations.
   Pretest: 63.8 / 25.5 / 6.4 / 4.3      Posttest: 40.4 / 40.4 / 14.9 / 4.3
2. Read or asked about a student club or organization.
   Pretest: 70.2 / 23.4 / 2.1 / 4.3      Posttest: 57.4 / 31.9 / 4.3 / 6.4
3. Attended a meeting of a student club or organization.
   Pretest: 85.1 / 10.6 / 0.0 / 4.3      Posttest: 66.0 / 17.0 / 10.6 / 6.4
4. Assumed a leadership role (held an office, headed a committee, etc.) in a student organization or club.
   Pretest: 93.6 / 4.3 / 0.0 / 2.1       Posttest: 87.2 / 4.3 / 4.3 / 4.3
5. Participated in a campus project or event sponsored by a student organization or club.
   Pretest: 85.1 / 10.6 / 0.0 / 4.3      Posttest: 76.6 / 14.9 / 2.1 / 6.4
6. Participated in a project or event OFF-CAMPUS which was sponsored by a student organization or club.
   Pretest: 93.6 / 2.1 / 0.0 / 4.3       Posttest: 87.2 / 6.4 / 4.3 / 2.1
7. Participated in a project or event OFF-CAMPUS which was not sponsored by a student organization or club.
   Pretest: 87.2 / 6.4 / 0.0 / 6.4       Posttest: 89.4 / 4.3 / 4.3 / 2.1
Table D.13
CCSEQ Counseling and Career Planning Items for the Control Group
Percentages; response scale: Never / Occasionally / Often / Very Often

1. Talked with a counselor/advisor about courses to take, requirements, educational plans.
   Pretest: 12.8 / 31.9 / 31.9 / 23.4    Posttest: 10.6 / 46.8 / 27.7 / 14.9
2. Discussed your vocational interests, abilities and ambitions with a counselor/advisor.
   Pretest: 36.2 / 29.8 / 19.1 / 14.9    Posttest: 29.8 / 44.7 / 12.8 / 12.8
3. Read information about a particular 4-year college or university that you were interested in attending.
   Pretest: 23.4 / 29.8 / 25.5 / 21.3    Posttest: 10.6 / 42.6 / 27.7 / 19.1
4. Read materials about career opportunities.
   Pretest: 12.8 / 25.5 / 34.0 / 27.7    Posttest: 10.6 / 29.8 / 34.0 / 25.5
5. Made an appointment with a counselor or an advisor to discuss your plans for transferring to a 4-year college or university.
   Pretest: 61.7 / 14.9 / 12.8 / 10.6    Posttest: 42.6 / 34.0 / 12.8 / 10.6
6. Identified courses needed to meet the general education requirements of a 4-year college or university you are interested in attending.
   Pretest: 31.9 / 19.1 / 34.0 / 14.9    Posttest: 25.5 / 29.8 / 25.5 / 19.1
7. Talked with a counselor/advisor about personal matters related to your college performance.
   Pretest: 61.7 / 19.1 / 17.0 / 2.1     Posttest: 36.2 / 42.6 / 12.8 / 8.5
8. Have taken interest inventories or surveys (e.g. Strong-Campbell Interest Inventory, Kuder Occupational Interest Survey, etc.) to help you direct your career goals.
   Pretest: 80.9 / 14.9 / 2.1 / 2.1      Posttest: 53.2 / 31.9 / 8.5 / 6.4
Table D.14
CCSEQ Estimate of Gains Items for the Control Group
Percentages; response scale: Very Little / Some / Quite a Bit / Very Much

1. Acquiring knowledge and skills applicable to a specific job or type of work.
   Pretest: 27.7 / 29.8 / 27.7 / 14.9    Posttest: 0.0 / 51.1 / 29.8 / 19.1
2. Gaining information about career opportunities.
   Pretest: 17.0 / 29.8 / 34.0 / 19.1    Posttest: 2.1 / 38.3 / 36.2 / 23.4
3. Developing clearer career goals.
   Pretest: 10.6 / 25.5 / 36.2 / 27.7    Posttest: 0.0 / 40.4 / 34.0 / 25.5
4. Becoming acquainted with different fields of knowledge.
   Pretest: 17.0 / 34.0 / 38.3 / 10.6    Posttest: 4.3 / 31.9 / 44.7 / 19.1
5. Developing an understanding and enjoyment of art, music, and theater.
   Pretest: 66.0 / 21.3 / 8.5 / 4.3      Posttest: 46.8 / 29.8 / 12.8 / 10.6
6. Developing an understanding and enjoyment of literature (novels, stories, essays, poetry, etc.).
   Pretest: 53.2 / 25.5 / 12.8 / 8.5     Posttest: 34.0 / 27.7 / 25.5 / 12.8
7. Writing clearly and effectively.
   Pretest: 23.4 / 36.2 / 23.4 / 17.0    Posttest: 8.5 / 44.7 / 29.8 / 17.0
8. Presenting ideas and information effectively in speaking to others.
   Pretest: 27.7 / 36.2 / 29.8 / 6.4     Posttest: 14.9 / 31.9 / 31.9 / 21.3
9. Acquiring skills needed to use computers to access information from the library, the INTERNET, the World Wide WEB, or other computer networks.
   Pretest: 27.7 / 23.4 / 23.4 / 25.5    Posttest: 8.5 / 27.7 / 31.9 / 31.9
10. Acquiring skills needed to use computers to produce papers, reports, graphs, charts, tables, or data analysis.
   Pretest: 40.4 / 27.7 / 17.0 / 14.9    Posttest: 14.9 / 27.7 / 36.2 / 21.3
11. Becoming aware of different philosophies, cultures, and ways of life.
   Pretest: 29.8 / 42.6 / 14.9 / 12.8    Posttest: 19.1 / 31.9 / 31.9 / 17.0
12. Becoming clearer about my own values and ethical standards.
   Pretest: 34.0 / 23.4 / 25.5 / 17.0    Posttest: 17.0 / 21.3 / 36.2 / 25.5
13. Understanding myself: my abilities and interests.
   Pretest: 12.8 / 23.4 / 31.9 / 31.9    Posttest: 6.4 / 25.5 / 27.7 / 40.4
14. Understanding mathematical concepts such as probabilities, proportions, etc.
   Pretest: 38.3 / 36.2 / 14.9 / 10.6    Posttest: 21.3 / 36.2 / 25.5 / 17.0
15. Understanding the role of science and technology in society.
   Pretest: 21.3 / 31.9 / 27.7 / 19.1    Posttest: 2.1 / 25.5 / 40.4 / 31.9
16. Putting ideas together to see relationships, similarities, and differences between ideas.
   Pretest: 25.5 / 34.0 / 23.4 / 17.0    Posttest: 10.6 / 23.4 / 29.8 / 36.2
17. Developing the ability to learn on my own, pursue ideas, and find information I need.
   Pretest: 12.8 / 21.3 / 40.4 / 25.5    Posttest: 8.5 / 23.4 / 36.2 / 31.9
18. Developing the ability to speak and understand another language.
   Pretest: 74.5 / 8.5 / 6.4 / 10.6      Posttest: 51.1 / 10.6 / 27.7 / 10.6
19. Interpreting information in graphs and charts I see in newspapers, textbooks, and on TV.
   Pretest: 34.0 / 38.3 / 19.1 / 8.5     Posttest: 31.9 / 29.8 / 21.3 / 17.0
20. Developing an interest in political and economic events.
   Pretest: 51.1 / 31.9 / 14.9 / 2.1     Posttest: 38.3 / 29.8 / 23.4 / 8.5
21. Seeing the importance of history for understanding the present as well as the past.
   Pretest: 36.2 / 34.0 / 25.5 / 4.3     Posttest: 29.8 / 23.4 / 29.8 / 17.0
22. Learning more about other parts of the world and other people (Asia, Africa, South America, etc.).
   Pretest: 42.6 / 29.8 / 21.3 / 6.4     Posttest: 38.3 / 23.4 / 21.3 / 17.0
23. Understanding other people and the ability to get along with different kinds of people.
   Pretest: 21.3 / 17.0 / 29.8 / 31.9    Posttest: 10.6 / 29.8 / 25.5 / 34.0
24. Developing good health habits and physical fitness.
   Pretest: 34.0 / 21.3 / 27.7 / 17.0    Posttest: 19.1 / 21.3 / 38.3 / 21.3
25. Developing the ability to get along with others in different kinds of situations.
   Pretest: 19.1 / 19.1 / 34.0 / 27.7    Posttest: 14.9 / 17.0 / 27.7 / 40.4
Table D.15
CCSEQ College Environment Items for the Control Group
Percentages; response scale shown in parentheses for each item

1. If you could start over again would you go to this college? (Yes / Maybe / No)
   Pretest: 93.6 / 6.4 / 0.0             Posttest: 100.0 / 0.0 / 0.0
2. How many of the students you know are friendly and supportive of one another? (All / Most / Some / Few or none)
   Pretest: 14.9 / 51.1 / 27.7 / 6.4     Posttest: 29.8 / 59.6 / 8.5 / 2.1
3. How many of your instructors at this college do you feel are approachable, helpful, and supportive? (All / Most / Some / Few or none)
   Pretest: 27.7 / 59.6 / 10.6 / 2.1     Posttest: 40.4 / 57.4 / 2.1 / 0.0
4. How many of the college counselors, advisors, and department secretaries you have had contact with would you describe as helpful, considerate, knowledgeable? (All / Most / Some / Few or none)
   Pretest: 31.9 / 51.1 / 12.8 / 4.3     Posttest: 40.4 / 44.7 / 8.5 / 6.4
5. How many of your courses at this college would you describe as challenging, stimulating, and worthwhile? (All / Most / Some / Few or none)
   Pretest: 25.5 / 57.4 / 17.0 / 0.0     Posttest: 27.7 / 63.8 / 8.5 / 0.0
6. Do you feel that this college is a stimulating and often exciting place to be? (All of the time / Most of the time / Some of the time / Rarely or never)
   Pretest: 14.9 / 61.7 / 21.3 / 2.1     Posttest: 34.0 / 48.9 / 17.0 / 0.0
7. Are there places on the campus for you to meet and study with other students? (Yes, ample places / Yes, a few places / No)
   Pretest: 57.4 / 40.4 / 2.1            Posttest: 63.8 / 29.8 / 6.4
8. Are there places on the campus for you to use computers and technology? (Yes, ample places / Yes, a few places / No)
   Pretest: 85.1 / 14.9 / 0.0            Posttest: 85.1 / 14.9 / 0.0
Table D.17
CCSEQ Learning and Study Skills Instruction Received for the Experimental Group
Percentages; amount of instruction received: None / Some / A lot

                             Pretest               Posttest
Memory skills                82.9 / 7.3 / 9.8      75.6 / 14.6 / 9.8
Note taking skills           82.9 / 12.2 / 4.9     73.2 / 12.2 / 14.6
Listening skills             78.0 / 17.1 / 4.9     70.7 / 22.0 / 7.3
Speaking skills              78.0 / 19.5 / 2.4     80.5 / 14.6 / 4.9
Writing skills               75.6 / 17.1 / 7.3     73.2 / 19.5 / 7.3
Reading skills               80.5 / 14.6 / 4.9     75.6 / 14.6 / 9.8
Test taking skills           78.0 / 14.6 / 7.3     78.0 / 17.1 / 4.9
Time management skills       78.0 / 22.0 / 0.0     80.5 / 12.2 / 7.3
Problem solving skills       78.0 / 17.1 / 4.9     78.0 / 12.2 / 9.5
Table D.18
CCSEQ College Activities Items for the Experimental Group
Percentages; response scale: Never / Occasionally / Often / Very Often

1. Participated in class discussions.
   Pretest: 7.3 / 53.7 / 26.8 / 12.2     Posttest: 9.8 / 56.1 / 19.5 / 14.6
2. Worked on a paper or project that combined ideas from different sources of information.
   Pretest: 12.2 / 34.1 / 41.5 / 12.2    Posttest: 9.8 / 29.3 / 43.9 / 17.1
3. Summarized major points and information from readings or notes.
   Pretest: 19.5 / 31.7 / 39.0 / 9.8     Posttest: 7.3 / 36.6 / 41.5 / 14.6
4. Tried to explain the material to another student.
   Pretest: 22.0 / 39.0 / 24.4 / 14.6    Posttest: 4.9 / 39.0 / 34.1 / 22.0
5. Did additional readings on topics that were introduced and discussed in class.
   Pretest: 39.0 / 48.8 / 9.8 / 2.4      Posttest: 29.3 / 53.7 / 7.3 / 9.8
6. Asked questions about points made in class discussions or readings.
   Pretest: 22.0 / 43.9 / 29.3 / 4.9     Posttest: 7.3 / 56.1 / 26.8 / 9.8
7. Studied course materials with other students.
   Pretest: 19.5 / 34.1 / 34.1 / 12.2    Posttest: 17.1 / 48.8 / 12.2 / 22.0
8. Applied principles and concepts learned in class to understand other problems or situations.
   Pretest: 14.6 / 53.7 / 29.3 / 2.4     Posttest: 12.2 / 51.2 / 26.8 / 9.8
9. Compared and contrasted different points of view presented in a course.
   Pretest: 17.1 / 48.8 / 34.1 / 0.0     Posttest: 19.5 / 51.2 / 22.0 / 7.3
10. Considered the accuracy and credibility of information from different sources.
   Pretest: 24.4 / 39.0 / 34.1 / 2.4     Posttest: 14.6 / 53.7 / 19.5 / 12.2
Table D.19
CCSEQ Library Activities Items for the Experimental Group
Percentages; response scale: Never / Occasionally / Often / Very Often

1. Used the library as a quiet place to read or study material you brought with you.
   Pretest: 17.1 / 43.9 / 26.8 / 12.2    Posttest: 34.1 / 36.6 / 14.6 / 14.6
2. Read newspapers, magazines, or journals located in the library or on-line.
   Pretest: 43.9 / 34.1 / 19.5 / 2.4     Posttest: 53.7 / 24.4 / 14.6 / 7.3
3. Checked out books and other materials to read at home.
   Pretest: 48.8 / 29.3 / 14.6 / 7.3     Posttest: 65.9 / 14.6 / 12.2 / 7.3
4. Used the card catalogue or computer to find materials the library had on a topic.
   Pretest: 39.0 / 31.7 / 24.4 / 4.9     Posttest: 56.1 / 29.3 / 4.9 / 9.8
5. Prepared a bibliography or set of references for a term paper or report.
   Pretest: 24.4 / 41.5 / 29.3 / 4.9     Posttest: 41.5 / 36.6 / 14.6 / 7.3
6. Asked the librarian for help in finding materials on some topic.
   Pretest: 41.5 / 43.9 / 9.8 / 4.9      Posttest: 56.1 / 29.3 / 9.8 / 4.9
7. Found some interesting material to read just by browsing in the stacks.
   Pretest: 70.7 / 17.1 / 9.8 / 2.4      Posttest: 73.2 / 19.5 / 2.4 / 4.9
Table D.20
CCSEQ Faculty Items for the Experimental Group
Percentages; response scale: Never / Occasionally / Often / Very Often

1. Asked an instructor for information about grades, make-up work, assignments, etc.
   Pretest: 9.8 / 41.5 / 36.6 / 12.2     Posttest: 0.0 / 46.3 / 41.5 / 12.2
2. Talked briefly with an instructor after class about course content.
   Pretest: 19.5 / 39.0 / 36.6 / 4.9     Posttest: 14.6 / 58.5 / 12.2 / 14.6
3. Made an appointment to meet with an instructor in his/her office.
   Pretest: 41.5 / 36.6 / 17.1 / 4.9     Posttest: 61.0 / 26.8 / 9.8 / 2.4
4. Discussed ideas for a term paper or other class project with an instructor.
   Pretest: 26.8 / 56.1 / 12.2 / 4.9     Posttest: 41.5 / 41.5 / 12.2 / 4.9
5. Discussed your career and/or educational plans, interests, and ambitions with an instructor.
   Pretest: 48.8 / 26.8 / 22.0 / 2.4     Posttest: 41.5 / 46.3 / 4.9 / 7.3
6. Discussed comments an instructor made on a test or paper you wrote.
   Pretest: 29.3 / 48.8 / 17.1 / 4.9     Posttest: 48.8 / 34.1 / 14.6 / 2.4
7. Talked informally with an instructor about current events, campus activities, or other common interests.
   Pretest: 53.7 / 29.3 / 12.2 / 4.9     Posttest: 53.7 / 31.7 / 12.2 / 2.4
8. Discussed your school performance, difficulties or personal problems with an instructor.
   Pretest: 53.7 / 34.1 / 7.3 / 4.9      Posttest: 61.0 / 29.3 / 4.9 / 4.9
9. Used electronic mail (E-mail) to communicate with your instructor.
   Pretest: 39.0 / 39.0 / 12.2 / 9.8     Posttest: 9.8 / 34.1 / 26.8 / 29.3
Table D.21 CCSEQ Student Acquaintances Items for the Experimental Group Item Percentage (%) 1. Had serious discussions with students who were much older or much younger than you. Never Occasionally Often Very Often
Pretest
31.7 56.1 4.9 7.3
Posttest
31.7 36.6 22.0 9.8
2. Had serious discussions with students whose ethnic or cultural background was different from yours. Never Occasionally Often Very Often
31.7 46.3 14.6 7.3
34.1 36.6 19.5 9.8
3. Had serious discussions with students whose philosophy of life or personal values were very different from yours. Never Occasionally Often Very Often
43.9 39.0 14.6 2.4
39.0 36.6 12.2 12.2
4. Had serious discussions with students whose political opinions were very different from yours. Never Occasionally Often Very Often
51.2 41.5 2.4 4.9
43.9 34.1 12.2 9.8
5. Had serious discussions with students whose religious beliefs were very different from yours. Never Occasionally Often Very Often
46.3 39.0 7.3 7.3
43.9 36.6 7.3 12.2
6. Had serious discussions with students from a country different from yours. Never Occasionally Often Very Often
39.0 48.8 9.8 2.4
34.1 43.9 12.2 9.8
Table D.22 CCSEQ Art, Music, Theater Activities Items for the Experimental Group (percentages; for each item, the first row of figures is the pretest and the second the posttest)
1. Talked about art (painting, sculpture, architecture, artists, etc.) with other students at the college. Never Occasionally Often Very Often
Pretest
75.6 17.1 4.9 2.4
Posttest
82.9 12.2 4.9 0.0
2. Talked about music (classical, popular, musicians, etc.) with other students at the college. Never Occasionally Often Very Often
48.8 36.6 9.8 4.9
63.4 26.8 9.8 0.0
3. Talked about theater (plays, musicals, dance, etc.) with other students at the college. Never Occasionally Often Very Often
63.4 24.4 9.8 2.4
68.3 19.5 9.8 2.4
4. Attended an art exhibit on the campus. Never Occasionally Often Very Often
75.6 19.5 4.9 0.0
85.4 12.2 2.4 0.0
5. Attended a concert or other musical event at the college. Never Occasionally Often Very Often
78.0 19.5 2.4 0.0
78.0 17.1 4.9 0.0
6. Attended a play, dance, concert, or other theater performance at the college. Never Occasionally Often Very Often
80.5 17.1 2.4 0.0
73.2 19.5 4.9 2.4
7. Participated in an art exhibit, musical event, or theater performance at the college. Never Occasionally Often Very Often
95.1 0.0 4.9 0.0
85.4 9.8 2.4 2.4
8. Attended an OFF-CAMPUS art exhibit, musical event, or theater performance for course credit. Never Occasionally Often Very Often
78.0 12.2 7.3 2.4
78.0 9.8 12.2 0.0
9. Participated in an OFF-CAMPUS art exhibit, musical event, or theater performance for course credit. Never Occasionally Often Very Often
85.4 9.8 2.4 2.4
92.7 4.9 2.4 0.0
Table D.23 CCSEQ Writing Activities Items for the Experimental Group (percentages; for each item, the first row of figures is the pretest and the second the posttest)
1. Used a dictionary [or computer (word processor) spell-check/thesaurus] to look up the proper meaning, definition, and/or spelling of words. Never Occasionally Often Very Often
Pretest
9.8 24.4 31.7 34.1
Posttest
7.3 43.9 19.5 29.3
2. Prepared an outline to organize the sequence of ideas and points in a paper you were writing. Never Occasionally Often Very Often
9.8 29.3 34.1 26.8
22.0 29.3 31.7 17.1
3. Thought about grammar, sentence structure, paragraphs and word choice as you were writing. Never Occasionally Often Very Often
9.8 17.1 46.3 26.8
12.2 26.8 36.6 24.4
4. Wrote a rough draft of a paper or essay and revised it before handing it in. Never Occasionally Often Very Often
9.8 14.6 29.3 46.3
14.6 24.4 34.1 26.8
5. Used a computer (word processor) to write or type a paper. Never Occasionally Often Very Often
7.3 12.2 17.1 63.4
12.2 14.6 31.7 41.5
6. Asked other people to read something you wrote to see if it was clear to them. Never Occasionally Often Very Often
14.6 34.1 14.6 36.6
19.5 58.5 14.6 7.3
7. Spent at least 5 hours or more writing a paper. Never Occasionally Often Very Often
17.1 24.4 26.8 31.7
36.6 24.4 19.5 19.5
8. Asked an instructor for advice and help to improve your writing or about a comment he/she made on a paper you wrote. Never Occasionally Often Very Often
22.0 34.1 24.4 19.5
48.8 22.0 19.5 9.8
Table D.24 CCSEQ Science Activities Items for the Experimental Group (percentages; for each item, the first row of figures is the pretest and the second the posttest)
1. Memorized formulas, definitions, technical terms. Never Occasionally Often Very Often
Pretest
17.1 14.6 29.3 39.0
Posttest
0.0 12.2 26.8 61.0
2. Practiced to improve your skills in using laboratory equipment. Never Occasionally Often Very Often
29.3 29.3 19.5 22.0
9.8 22.0 36.6 31.7
3. Showed a classmate how to use a piece of scientific equipment. Never Occasionally Often Very Often
29.3 43.9 17.1 9.8
4.9 46.3 26.8 22.0
4. Attempted to explain an experimental procedure to a classmate. Never Occasionally Often Very Often
34.1 43.9 12.2 9.8
4.9 39.0 31.7 24.4
5. Tested your understanding of some scientific principle by seeing if you could explain it to another student. Never Occasionally Often Very Often
43.9 31.7 17.1 7.3
19.5 36.6 29.3 14.6
6. Completed an experiment/project using scientific methods. Never Occasionally Often Very Often
34.1 29.3 22.0 14.6
7.3 39.0 34.1 19.5
7. Talked about social and ethical issues related to science and technology such as energy, pollution, chemicals, genetics, etc. Never Occasionally Often Very Often
51.2 29.3 9.8 9.8
36.6 36.6 14.6 12.2
8. Used information you learned in a science class to understand some aspect of the world around you. Never Occasionally Often Very Often
31.7 26.8 26.8 14.6
12.2 29.3 36.6 22.0
9. Tried to explain to someone the scientific basis for environmental concerns about pollution, recycling, alternative forms of energy, etc. Never Occasionally Often Very Often
65.9 22.0 2.4 9.8
58.5 29.3 7.3 4.9
10. Did paid or volunteer work OFF-CAMPUS to help the environment after learning about environmental issues in class. Never Occasionally Often Very Often
85.4 12.2 0.0 2.4
90.2 7.3 2.4 0.0
11. Applied information or skills you learned in a science class to work (either volunteer or paid) outside of class. Never Occasionally Often Very Often
58.5 29.3 7.3 4.9
61.0 22.0 7.3 9.8
Table D.25 CCSEQ Athletic Activities Items for the Experimental Group (percentages; for each item, the first row of figures is the pretest and the second the posttest)
1. Followed a regular exercise program on campus. Never Occasionally Often Very Often
Pretest
70.7 17.1 2.4 9.8
Posttest
65.9 17.1 9.8 7.3
2. Sought athletic instruction. Never Occasionally Often Very Often
85.4 12.2 2.4 0.0
78.0 17.1 4.9 0.0
3. Attended an athletic event on campus. Never Occasionally Often Very Often
95.1 2.4 2.4 0.0
95.1 4.9 0.0 0.0
4. Coached or assisted with youth athletic programs on campus. Never Occasionally Often Very Often
100.0 0.0 0.0 0.0
100.0 0.0 0.0 0.0
5. Coached or assisted with OFF-CAMPUS youth athletic programs for course credit. Never Occasionally Often Very Often
97.6 2.4 0.0 0.0
100.0 0.0 0.0 0.0
6. Participated in a sport on campus. Never Occasionally Often Very Often
95.1 4.9 0.0 0.0
97.6 2.4 0.0 0.0
Table D.26 CCSEQ Computer Technology Items for the Experimental Group (percentages; for each item, the first row of figures is the pretest and the second the posttest)
1. Used E-mail to communicate with an instructor or other students about a course. Never Occasionally Often Very Often
Pretest
43.9 29.3 19.5 7.3
Posttest
4.9 24.4 34.1 36.6
2. Used the World Wide WEB or INTERNET [or other computer network] to get information for a class project or paper. Never Occasionally Often Very Often
12.2 17.1 22.0 48.8
2.4 29.3 29.3 39.0
3. Used a computer tutorial to learn material for a course or remedial program. Never Occasionally Often Very Often
68.3 17.1 9.8 4.9
51.2 24.4 14.6 9.8
4. Used computers in a group (cooperative) learning situation in class. Never Occasionally Often Very Often
46.3 29.3 17.1 7.3
41.5 26.8 14.6 17.1
5. Used a computer for some type of database management. Never Occasionally Often Very Often
46.3 22.0 24.4 7.3
34.1 26.8 24.4 14.6
6. Used a computer to analyze data for a class project. Never Occasionally Often Very Often
46.3 19.5 22.0 12.2
39.0 29.3 22.0 9.8
7. Used a computer to create graphs or charts for a class paper or project. Never Occasionally Often Very Often
36.6 39.0 17.1 7.3
48.8 26.8 12.2 12.2
8. Wrote an application using existing software or programming languages. Never Occasionally Often Very Often
70.7 26.8 0.0 2.4
70.7 19.5 2.4 7.3
Table D.27 CCSEQ Clubs and Organizations Items for the Experimental Group (percentages; for each item, the first row of figures is the pretest and the second the posttest)
1. Looked for notices about campus events and student organizations. Never Occasionally Often Very Often
Pretest
63.4 24.4 9.8 2.4
Posttest
41.5 43.9 12.2 2.4
2. Read or asked about a student club or organization. Never Occasionally Often Very Often
65.9 22.0 12.2 0.0
61.0 26.8 7.3 4.9
3. Attended a meeting of a student club or organization. Never Occasionally Often Very Often
82.9 14.6 0.0 2.4
80.5 14.6 0.0 4.9
4. Assumed a leadership role (held an office, headed a committee, etc.) in a student organization or club. Never Occasionally Often Very Often
85.4 9.8 2.4 2.4
90.2 7.3 0.0 2.4
5. Participated in a campus project or event sponsored by a student organization or club. Never Occasionally Often Very Often
85.4 9.8 2.4 2.4
78.0 9.8 2.4 9.8
6. Participated in a project or event OFF-CAMPUS which was sponsored by a student organization or club. Never Occasionally Often Very Often
90.2 0.0 7.3 2.4
90.2 2.4 2.4 4.9
7. Participated in a project or event OFF-CAMPUS which was not sponsored by a student organization or club. Never Occasionally Often Very Often
85.4 4.9 7.3 2.4
95.1 0.0 2.4 2.4
Table D.28 CCSEQ Counseling and Career Planning Items for the Experimental Group (percentages; for each item, the first row of figures is the pretest and the second the posttest)
1. Talked with a counselor/advisor about courses to take, requirements, educational plans. Never Occasionally Often Very Often
Pretest
12.2 46.3 17.1 24.4
Posttest
14.6 53.7 22.0 9.8
2. Discussed your vocational interests, abilities and ambitions with a counselor/advisor. Never Occasionally Often Very Often
31.7 34.1 19.5 14.6
29.3 46.3 12.2 12.2
3. Read information about a particular 4-year college or university that you were interested in attending. Never Occasionally Often Very Often
29.3 19.5 26.8 24.4
26.8 26.8 31.7 14.6
4. Read materials about career opportunities. Never Occasionally Often Very Often
22.0 24.4 31.7 22.0
17.1 29.3 41.5 12.2
5. Made an appointment with a counselor or an advisor to discuss your plans for transferring to a 4-year college or university. Never Occasionally Often Very Often
48.8 24.4 14.6 12.2
46.3 26.8 19.5 7.3
6. Identified courses needed to meet the general education requirements of a 4-year college or university you are interested in attending. Never Occasionally Often Very Often
22.0 22.0 26.8 29.3
31.7 17.1 34.1 17.1
7. Talked with a counselor/advisor about personal matters related to your college performance. Never Occasionally Often Very Often
56.1 29.3 4.9 9.8
63.4 14.6 14.6 7.3
8. Have taken interest inventories or surveys (e.g. Strong-Campbell Interest Inventory, Kuder Occupational Interest Survey, etc.) to help you direct your career goals. Never Occasionally Often Very Often
61.0 31.7 7.3 0.0
61.0 26.8 7.3 4.9
Table D.29 CCSEQ Estimate of Gains Items for the Experimental Group (percentages; for each item, the first row of figures is the pretest and the second the posttest)
1. Acquiring knowledge and skills applicable to a specific job or type of work. Very Little Some Quite a Bit Very Much
Pretest
17.1 26.8 34.1 22.0
Posttest
9.8 36.6 36.6 17.1
2. Gaining information about career opportunities. Very Little Some Quite a Bit Very Much
4.9 43.9 29.3 22.0
14.6 24.4 36.6 24.4
3. Developing clearer career goals. Very Little Some Quite a Bit Very Much
4.9 31.7 26.8 36.6
17.1 19.5 31.7 31.7
4. Becoming acquainted with different fields of knowledge. Very Little Some Quite a Bit Very Much
12.2 31.7 48.8 7.3
12.2 36.6 34.1 17.1
5. Developing an understanding and enjoyment of art, music, and theater. Very Little Some Quite a Bit Very Much
56.1 26.8 9.8 7.3
56.1 29.3 12.2 2.4
6. Developing an understanding and enjoyment of literature (novels, stories, essays, poetry, etc.). Very Little Some Quite a Bit Very Much
41.5 31.7 17.1 9.8
56.1 24.4 14.6 4.9
7. Writing clearly and effectively. Very Little Some Quite a Bit Very Much
19.5 31.7 41.5 7.3
31.7 29.3 29.3 9.8
8. Presenting ideas and information effectively in speaking to others. Very Little Some Quite a Bit Very Much
22.0 34.1 36.6 7.3
29.3 39.0 19.5 12.2
9. Acquiring skills needed to use computers to access information from the library, the INTERNET, the World Wide WEB, or other computer networks. Very Little Some Quite a Bit Very Much
22.0 34.1 14.6 29.3
26.8 36.6 14.6 22.0
10. Acquiring skills needed to use computers to produce papers, reports, graphs, charts, tables, or data analysis. Very Little Some Quite a Bit Very Much
26.8 31.7 19.5 22.0
41.5 24.4 14.6 19.5
11. Becoming aware of different philosophies, cultures, and ways of life. Very Little Some Quite a Bit Very Much
17.1 43.9 24.4 14.6
31.7 43.9 12.2 12.2
12. Becoming clearer about my own values and ethical standards. Very Little Some Quite a Bit Very Much
24.4 29.3 19.5 26.8
31.7 31.7 26.8 9.8
13. Understanding myself: my abilities and interests. Very Little Some Quite a Bit Very Much
9.8 29.3 26.8 34.1
22.0 22.0 29.3 26.8
14. Understanding mathematical concepts such as probabilities, proportions, etc. Very Little Some Quite a Bit Very Much
26.8 41.5 17.1 14.6
29.3 34.1 26.8 9.8
15. Understanding the role of science and technology in society. Very Little Some Quite a Bit Very Much
19.5 26.8 31.7 22.0
9.8 29.3 34.1 26.8
16. Putting ideas together to see relationships, similarities, and differences between ideas. Very Little Some Quite a Bit Very Much
14.6 34.1 31.7 19.5
17.1 39.0 26.8 17.1
17. Developing the ability to learn on my own, pursue ideas, and find information I need. Very Little Some Quite a Bit Very Much
4.9 24.4 46.3 24.4
17.1 26.8 26.8 29.3
18. Developing the ability to speak and understand another language. Very Little Some Quite a Bit Very Much
65.9 17.1 7.3 9.8
56.1 26.8 12.2 4.9
19. Interpreting information in graphs and charts I see in newspapers, textbooks, and on TV. Very Little Some Quite a Bit Very Much
34.1 36.6 4.9 24.4
41.5 39.0 14.6 4.9
20. Developing an interest in political and economic events. Very Little Some Quite a Bit Very Much
53.7 31.7 12.2 2.4
63.4 22.0 14.6 0.0
21. Seeing the importance of history for understanding the present as well as the past. Very Little Some Quite a Bit Very Much
39.0 26.8 26.8 7.3
43.9 31.7 12.2 12.2
22. Learning more about other parts of the world and other people (Asia, Africa, South America, etc.). Very Little Some Quite a Bit Very Much
41.5 36.6 14.6 7.3
43.9 31.7 17.1 7.3
23. Understanding other people and the ability to get along with different kinds of people. Very Little Some Quite a Bit Very Much
9.8 36.6 31.7 22.0
24.4 43.9 19.5 12.2
24. Developing good health habits and physical fitness. Very Little Some Quite a Bit Very Much
36.6 31.7 14.6 17.1
29.3 24.4 34.1 12.2
25. Developing the ability to get along with others in different kinds of situations. Very Little Some Quite a Bit Very Much
17.1 34.1 24.4 24.4
22.0 29.3 34.1 14.6
Table D.30 CCSEQ College Environment Items for the Experimental Group (percentages; for each item, the first row of figures is the pretest and the second the posttest)
1. If you could start over again, would you go to this college? Yes Maybe No
Pretest
82.9 14.6 2.4
Posttest
82.9 14.6 2.4
2. How many of the students you know are friendly and supportive of one another? All Most Some Few or none
14.6 53.7 29.3 2.4
9.8 68.3 17.1 4.9
3. How many of your instructors at this college do you feel are approachable, helpful, and supportive? All Most Some Few or none
26.8 56.1 17.1 0.0
39.0 48.8 12.2 0.0
4. How many of the college counselors, advisors, and department secretaries you have had contact with would you describe as helpful, considerate, knowledgeable? All Most Some Few or none
22.0 51.2 14.6 12.2
19.5 34.1 26.8 19.5
5. How many of your courses at this college would you describe as challenging, stimulating, and worthwhile? All Most Some Few or none
17.1 68.3 14.6 0.0
19.5 48.8 26.8 4.9
6. Do you feel that this college is a stimulating and often exciting place to be? All of the time Most of the time Some of the time Rarely or never
12.2 43.9 41.5 2.4
9.8 68.3 17.1 4.9
7. Are there places on the campus for you to meet and study with other students? Yes, ample places Yes, a few places No
36.6 58.5 4.9
48.8 41.5 9.8
8. Are there places on the campus for you to use computers and technology? Yes, ample places Yes, a few places No
53.7 46.3 0.0
70.7 29.3 0.0
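Because the experimental group has n = 41, the percentages in Tables D.20 through D.30 step in increments of roughly 2.44 points, so the underlying response counts can be recovered from each row by rounding. A minimal, hypothetical sketch (the function name is illustrative; this is not part of the dissertation's analysis):

```python
def counts_from_percentages(percentages, n):
    """Recover integer response counts from percentages rounded to 0.1."""
    counts = [round(p / 100 * n) for p in percentages]
    # Sanity check: recovered counts must reproduce the group size
    assert sum(counts) == n, "percentages do not round back to n"
    return counts

# Table D.26, item 1 (used e-mail about a course), pretest row, n = 41
print(counts_from_percentages([43.9, 29.3, 19.5, 7.3], 41))  # [18, 12, 8, 3]
```

Recovering counts this way makes item-level frequency comparisons (and any chi-square-style checks a reader might want to run) possible from the published percentages alone.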
Table D.31 CCSEQ Quality of Effort (QE) Scales Means (and Standard Deviations) for the Control (n=47) and Experimental (n=41) Groups; entries are M (SD)
1. QE Course Learning, possible range 10-40
   Control: pretest 22.47 (6.068), posttest 24.38 (6.302)
   Experimental: pretest 22.51 (5.124), posttest 24.00 (5.758)
2. QE Library, possible range 7-28
   Control: pretest 12.85 (4.695), posttest 13.06 (4.474)
   Experimental: pretest 13.27 (4.566), posttest 12.05 (5.045)
3. QE Faculty, possible range 8-32
   Control: pretest 15.85 (6.079), posttest 16.30 (5.187)
   Experimental: pretest 17.59 (5.371), posttest 17.68 (4.906)
4. QE Student Acquaintances, possible range 6-24
   Control: pretest 11.06 (4.131), posttest 12.34 (5.172)
   Experimental: pretest 10.73 (4.068), posttest 11.85 (4.902)
5. QE Art, Music, and Theater, possible range 6-24
   Control: pretest 11.89 (3.389), posttest 12.87 (4.812)
   Experimental: pretest 11.98 (4.009), posttest 11.61 (3.707)
6. QE Writing, possible range 8-32
   Control: pretest 20.91 (7.147), posttest 22.06 (6.084)
   Experimental: pretest 22.95 (6.786), posttest 19.85 (5.977)
7. QE Science, possible range 9-36
   Control: pretest 19.68 (6.751), posttest 26.45 (6.999)
   Experimental: pretest 21.71 (8.177), posttest 25.93 (5.918)
8. QE Computer Technology, possible range 8-32
   Control: pretest 14.94 (5.383), posttest 17.66 (5.928)
   Experimental: pretest 15.56 (5.201), posttest 17.54 (5.221)
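Summary tables like D.31 report only means, standard deviations, and group sizes, but those suffice to recover standard two-group comparisons. As a minimal, hypothetical sketch (not the dissertation's actual analysis; the function name is illustrative), the following computes a pooled-variance independent-samples t statistic and Cohen's d for the QE Science posttest means in Table D.31:

```python
import math

def pooled_t_and_d(m1, sd1, n1, m2, sd2, n2):
    """Independent-samples t statistic and Cohen's d from summary statistics."""
    # Pooled variance across the two groups
    sp2 = ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
    sp = math.sqrt(sp2)
    t = (m1 - m2) / (sp * math.sqrt(1 / n1 + 1 / n2))
    d = (m1 - m2) / sp
    return t, d

# QE Science posttest, Table D.31:
# control M = 26.45, SD = 6.999, n = 47; experimental M = 25.93, SD = 5.918, n = 41
t, d = pooled_t_and_d(26.45, 6.999, 47, 25.93, 5.918, 41)
print(round(t, 2), round(d, 2))  # 0.37 0.08
```

With 86 degrees of freedom, a t of this size falls far below conventional critical values, so a posttest mean difference of this magnitude would not approach statistical significance.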
Table D.32 CCSEQ Scores for Academic Integration Dependent Variables for the Control (n=47) and Experimental (n=41) Groups; entries are M (SD)
1. Grade point average (self-reported)
   Control: pretest 2.96 (1.978), posttest 2.96 (1.978)
   Experimental: pretest 2.83 (1.935), posttest 2.83 (1.935)
2. Time spent studying or preparing for classes
   Control: pretest 1.89 (1.026), posttest 1.89 (1.026)
   Experimental: pretest 2.17 (.998), posttest 2.17 (.998)
3. Course activities (10-item scale)
   Control: pretest 22.47 (6.068), posttest 24.38 (6.302)
   Experimental: pretest 22.51 (5.124), posttest 24.00 (5.758)
4. Library activities (7-item scale)
   Control: pretest 12.85 (4.695), posttest 13.06 (4.474)
   Experimental: pretest 13.27 (4.566), posttest 12.05 (5.045)
5. Learning and study skills (9-item scale)
   Control: pretest 12.57 (5.003), posttest 15.47 (5.897)
   Experimental: pretest 11.34 (4.157), posttest 11.90 (4.549)
6. Writing activities (8-item scale)
   Control: pretest 20.91 (7.147), posttest 22.06 (6.084)
   Experimental: pretest 22.95 (6.786), posttest 19.85 (5.977)
7. Science activities (11-item scale)
   Control: pretest 19.68 (6.751), posttest 26.45 (6.999)
   Experimental: pretest 21.71 (8.177), posttest 25.93 (5.918)
8. Computer technology (8-item scale)
   Control: pretest 14.94 (5.383), posttest 17.66 (5.928)
   Experimental: pretest 15.56 (5.201), posttest 17.54 (5.221)
9. Experiences with faculty (9-item scale)
   Control: pretest 15.85 (6.079), posttest 16.30 (5.187)
   Experimental: pretest 17.59 (5.371), posttest 17.68 (4.906)
Table D.33 CCSEQ Scores for Social Integration Dependent Variables for the Control (n=47) and Experimental (n=41) Groups; entries are M (SD)
1. Student acquaintances (6-item scale)
   Control: pretest 11.06 (4.131), posttest 12.34 (5.172)
   Experimental: pretest 10.73 (4.068), posttest 11.85 (4.902)
2. Art, music, theater (9-item scale)
   Control: pretest 11.89 (3.389), posttest 12.87 (4.812)
   Experimental: pretest 11.98 (4.009), posttest 11.61 (3.707)
3. Hours spent on campus
   Control: pretest 2.13 (1.279), posttest 2.13 (1.279)
   Experimental: pretest 1.98 (.935), posttest 1.98 (.935)
4. Clubs and organizations (7-item scale)
   Control: pretest 8.89 (4.071), posttest 10.04 (4.457)
   Experimental: pretest 9.12 (3.926), posttest 9.54 (4.112)
5. Athletics (6-item scale)
   Control: pretest 7.53 (2.292), posttest 7.96 (2.750)
   Experimental: pretest 6.83 (1.482), posttest 6.93 (1.539)
6. Counseling and career planning activities (8-item scale)
   Control: pretest 16.89 (5.798), posttest 17.77 (5.692)
   Experimental: pretest 17.39 (6.344), posttest 16.63 (6.272)
7. College environment (8-item scale)
   Control: pretest 23.30 (2.562), posttest 24.53 (2.244)
   Experimental: pretest 22.07 (2.944), posttest 22.20 (3.635)
8. Estimate of gain (25-item scale)
   Control: pretest 55.94 (17.605), posttest 64.38 (16.700)
   Experimental: pretest 58.54 (15.961), posttest 55.29 (17.212)
APPENDIX E
INSTITUTIONAL REVIEW BOARD APPROVAL
REFERENCES
Abrahamowicz, D. (1988). College involvement, perceptions, and satisfaction: A study of membership in student organizations. Journal of College Student Development, 29, 233-238.
Acebo, S., Burrus, B.G., & Kanter, M. (1998). “Most wired” college tells of journey to the information age. Community College Journal, 69(1), 12-19.
Ackerman, S.P. (1990). A comparison of a sub-population of Santa Monica College students to other community college students in the southern California area: An analysis of the results from the Community College Student Experiences Questionnaire. (Report for Santa Monica Community College District). Santa Monica, CA. (ERIC Document Reproduction Service No. ED 315 132)
Aitken, N. (1982). College student performance, satisfaction, and retention: Specification and estimation of a structural model. Journal of Higher Education, 53, 32-50.
Allen, D.F. (1986). Attrition at a commuter institution: A path analytic validation of Tinto’s theoretical model of college withdrawal. Paper presented at the meeting of the American College Personnel Association, Los Angeles, CA.
Allen, D.F. & Nelson, J.M. (1989). Tinto’s model of college withdrawal applied to women in two institutions. Journal of Research and Development in Education, 22(3), 1-11.
Althaus, S.L. (1997). Computer-mediated communication in the university classroom: An experiment with on-line discussions. Communication Education, 46, 158-174.
Ashmore, N.J.M. (2000). The relationship between computer engagement and estimate of gains for students of a two-year college. (Doctoral dissertation, University of Memphis, 2000). Dissertation Abstracts International, A61/11, ISBN: 0-493-02911-7.
Aslanian, C.B. (1997). Community colleges put their best faces forward for students of tomorrow. Community College Journal, 68(3), 17-20.
Astin, A.W. (1977). Four critical years: Effects of college on beliefs, attitudes, and knowledge. San Francisco: Jossey-Bass.
Astin, A.W. (1984). Student involvement: A developmental theory for higher education. Journal of College Student Personnel, 25(4), 297-308.
Astin, A.W. (1985). Achieving educational excellence: A critical assessment of priorities and practices in higher education. San Francisco: Jossey-Bass.
Astin, A.W. (1993). What matters in college? Four critical years revisited. San Francisco: Jossey-Bass.
Baker, G.A., III (1998). Keeping all generations happy: The Xers, boomers, and beyond. Community College Journal, 68(5), 10-17.
Baker, G.A., III (1999). Building the comprehensive community college. Community College Journal, 69(4), 32-39.
Baumgart, N., & Johnston, J. (1977). Attrition at an Australian university: A case study. Journal of Higher Education, 48, 553-570.
Bean, J.P. (1980). Dropouts and turnover: The synthesis and test of a causal model. Research in Higher Education, 12(1), 155-187.
Bean, J.P. (1982). Student attrition, intentions, and confidence: Interaction effects in a path model. Research in Higher Education, 17(4), 291-320.
Bean, J.P. & Metzner, B.S. (1985). A conceptual model of nontraditional undergraduate student attrition. Review of Educational Research, 55(4), 485-540.
Berge, Z.L. & Collins, M.P. (1995). Introduction. In Z.L. Berge & M.P. Collins (Eds.), Computer-mediated communication and the online classroom: Overview and perspectives: Volume one (pp. 1-10). Cresskill, NJ: Hampton Press, Inc.
Berger, J.B., & Braxton, J.M. (1998). Revising Tinto’s interactionalist theory of
student departure through theory elaboration: Examining the role of organizational attributes in the persistence process. Research in Higher Education, 39(2), 103-119.
Bigelow, D. (1993). Information superhighways: Will they reach community colleges? Community College Journal, 64(2), 22-25.
Borglum, K., & Kubala, T. (2000). Academic and social integration of community college students: A case study. Community College Journal of Research and Practice, 24, 567-576.
Bowen, H.R. (1977). Investment in learning. San Francisco: Jossey-Bass.
Bower, B. (1998). Instructional computer use in the community college: A discussion of the research and its implications. Journal of Applied Research in the Community College, 6(1), 59-66.
Braxton, J.M., & Brier, E.M. (1989). Melding organizational and interactional theories of student attrition: A path analytic study. Review of Higher Education, 13(1), 47-61.
Braxton, J.M., Brier, E.M., & Hossler, D. (1988). The influence of student
problems on student withdrawal decisions: An autopsy on “autopsy studies.” Research in Higher Education, 28(3), 241-253.
Braxton, J.M., Sullivan, A.V., & Johnson, R.M. (1997). Appraising Tinto’s theory
of college student departure. In John C. Smart (ed.), Higher education: Handbook of theory and research, (vol. XII, pp. 107-164). New York: Agathon.
Braxton, J.M., Vesper, N., & Hossler, D. (1995). Expectations for college and student persistence. Research in Higher Education, 36, 595-612.
Bryant, D.W. (1994). The battle of instructional effectiveness. Community College Journal, 65(3), 16-23.
Bryant, D.W. (1998). Understanding the parallax of community college status. Community College Journal, 69(1), 32-35.
Burnett, D.S. (1996). The relationship of student success to involvement in
student activities in a two-year institution. (Doctoral dissertation, University of Southern California, 1996). Dissertation Abstracts International, 57 07A.
Cabrera, A.F., Castaneda, M.B., Nora, A., & Hengstler, D. (1992). The
convergence between two theories of college persistence. Journal of Higher Education, 63(2), 143-164.
Cabrera, A.F., Nora, A., & Castaneda, M.B. (1992). The role of finances in the persistence process: A structural model. Research in Higher Education, 33(5), 571-593.
Campbell, D.T., & Stanley, J.C. (1981). Experimental and quasi-experimental designs for research. Boston: Houghton Mifflin.
Cash, R.W., & Bissel, H.L. (1985). Testing Tinto’s model of attrition on the church-related campus. Paper presented at the Association for Institutional Research Annual Forum, Portland, OR.
Chapman, G. (1998). Factors affecting student attitudes and use of computer-mediated communication in traditional college courses. Journal of Instruction Delivery Systems, 12(4), 21-25.
Chickering, A.W., & Ehrmann, S.C. (1996). Implementing the seven principles:
Technology as lever. AAHE (American Association of Higher Education) Bulletin, 49(2), 3-6.
Chickering, A.W., & Gamson, Z.F. (1987). Seven principles for good practice in undergraduate education. AAHE (American Association of Higher Education) Bulletin 39(7), 3-7.
Chickering, A.W., & Reisser, L. (1993). Education and identity, (2nd Ed.). San
Francisco: Jossey-Bass. Clay-Warner, J. & Marsh, K. (2000). Implementing computer-mediated
communication in the college classroom. Journal of Educational Computing Research, 23(3), 257-274.
Cohen, A.M. & Brawer, F.B. (1991). The American community college. San Francisco: Jossey-Bass.
Cousineau, J., & Landon, B. (1989, May). Measuring academic outcomes…and identifying what influences them. Paper presented at the Association for Institutional Research Forum, Baltimore, MD. (ERIC Document Reproduction Service No. ED 308 790)
Davis, T.M., & Murrell, P.H. (1990). Joint factor analysis of the College Student
Experiences Questionnaire and the ACT-Comp objective exam. Research in Higher Education, 31(5), 425-441.
Doucette, D. (1994). Transforming teaching and learning using information technology: A report from the field. Community College Journal, 65(2), 18-25.
Douzenis, C. (1994). The Community College Student Experiences Questionnaire: Introduction and application. Community College Journal of Research and Practice, 18, 261-268.
Douzenis, C. (1996). The relationship of quality of effort and estimate of
knowledge gain among community college students. Community College Review, 24(3), 27-36.
Douzenis, C., & Murrell, P.H. (1992, May). An analysis of students’ experiences
at selected community colleges in Tennessee: Findings from the Community College Student Experiences Questionnaire. Paper presented at The Association for Institutional Research Forum, Atlanta, GA.
Dowaliby, F.J., Garrison, W.M., & Dagel, D. (1993). The student integration
survey: Development of an early alert assessment and reporting system. Research in Higher Education, 34(4), 513-531.
Durkheim, E. (1951). Suicide. Glencoe, IL: The Free Press.
Elkins, S.A., Braxton, J.M., & James, G.W. (1998, May). Tinto’s separation stage and its influence on first-semester college student persistence. Paper presented at the Association for Institutional Research Forum, Minneapolis, MN.
Ethington, C.A., Guthrie, A.M., & Lehman, P.W. (2001). CCSEQ: Test manual and comparative data (3rd ed.). Memphis, TN: University of Memphis, Center for the Study of Higher Education.
Everett, D.R. & Ahern, T.C. (1994). Computer-mediated communication as a
teaching tool: A case study. Journal of Research on Computing in Education, 26(3), 336-357.
Faith, E.S., & Murrell, P.H. (1992). The social and academic integration of black
and white students: A pilot study of four two-year institutions in West Tennessee using the CCSEQ. Memphis, TN: Memphis State University, Center for the Study of Higher Education.
Feldman, K., & Newcomb, T. (1969). The impact of college on students. San Francisco: Jossey-Bass.
Fitch, R.T. (1991). The interpersonal values of students at differing levels of extracurricular involvement. Journal of College Student Development, 32(1), 24-30.
Flowers, L., Pascarella, E.T., & Pierson, C.T. (2000). Information technology use and cognitive outcomes in the first year of college. Journal of Higher Education, 71, 637-667.
Friedlander, J. (1993). Are we using instructional technology effectively? Evaluative/feasibility report. (ERIC Document Reproduction Service No. ED 360 008)
Friedlander, J., & MacDougall, P. (1992). Achieving student success through student involvement. Community College Review, 20(1), 20-28.
Friedlander, J., Murrell, P.H., & MacDougall, P.R. (1993). The Community College Student Experiences Questionnaire. In T. Banta & Associates (Eds.), Making a difference: Outcomes of a decade of assessment in higher education (pp. 196-210). San Francisco: Jossey-Bass.
Friedlander, J., Pace, C.R., & Lehman, P.W. (1990). Community College Student Experiences Questionnaire. Los Angeles, CA: Center for the Study of Evaluation, University of California, Los Angeles.
Gall, M.D., Borg, W.R., & Gall, J.P. (1996). Educational research: An introduction (6th ed.). White Plains, NY: Longman.
Gatz, L.B., & Hirt, J.B. (2000). Academic and social integration in cyberspace: Students and e-mail. The Review of Higher Education, 23(3), 299-318.
Glover, J.W. (1996, November). Campus environment and student involvement as
predictors of outcomes of the community college experience. Paper presented at the Association for the Study of Higher Education Annual Meeting. Memphis, TN. (ERIC Document Reproduction Service No. ED-402-831)
Glover, J.S. & Murrell, P. H. (1998). Campus environment and student
involvement as predictors of outcomes of the community college experience. Journal of Applied Research in the Community College 6(1), 5-13.
Green, K.C. (1996). Campus computing 1996: The seventh national survey of desktop computing in higher education. Encino, CA: Campus Computing.
Harasim, L. (1990). Online education: An environment for collaboration and intellectual amplification. In L. Harasim (Ed.), Online education: Perspectives on a new environment (pp. 39-63). New York: Praeger.
Heterick, R.C. (Ed.). (1993). Reengineering teaching and learning in higher education: Sheltered groves, Camelot, windmills, and malls (Professional Paper Series #10). Boulder, CO: CAUSE. (ERIC Document Reproduction Service No. ED 359 921)
Hinkle, D.E., Wiersma, W., & Jurs, S.G. (1998). Applied statistics for the behavioral sciences (4th ed.). Boston: Houghton Mifflin Company.
Holden, M., & Mitchell, W. (1993). The future of computer-mediated communication in higher education. EDUCOM Review, 28(2), 31-37.
Hopkins, E. (1998, April). Some interesting facts on higher education. Peterson’s Guide to Distance Learning (on-line). Available on World Wide Web: http://www.petersons.com.
Kienzl, G. & Li, Y. (1997). Computer technology at community colleges (AACC
Research Brief AACC-RB-97-2). Washington, DC: American Association of Community Colleges.
Kim, M., & Alvarez, R. (1995). Women-only colleges: Some unanticipated consequences. The Journal of Higher Education, 66(6), 641-668.
Knight, W.E. (1992, July). Report of the results of the Community College Student Experiences Questionnaire: Report 1: Background and composite results. Kent, OH: Kent State University, Regional Campuses, Office of Academic Assessment and Evaluation Services. (ERIC Document Reproduction Service No. ED 346 911)
Knight, W.E. (1994, June). Influences on the academic, career, and personal gains and satisfaction of community college students. Paper presented at the Association for Institutional Research Annual Forum. (ERIC Document Reproduction Service No. ED-373-644)
Kuh, G.D. (1981). Indices of quality in the undergraduate experience. AAHE-
ERIC/Higher Education Research (Report No. 4), Washington, DC: American Association for Higher Education.
Kuh, G.D. (1993). In their own words: What students learn outside the classroom. American Educational Research Journal, 30, 277-304.

Kuh, G.D. (1995). The other curriculum: Out-of-class experiences associated with student learning and personal development. Journal of Higher Education, 66(2), 123-155.

Kuh, G.D., & Hu, S. (2001). The relationship between computer and information technology use, selected learning and personal development outcomes, and other college experiences. Journal of College Student Development, 42(3), 217-232.
Kuh, G.D., & Vesper, N. (2001). Do computers enhance or detract from student learning? Research in Higher Education, 42, 87-102.

Kuh, G.D., Schuh, J.H., Whitt, E.J., Andreas, R.E., Lyons, J.W., Strange, C.C., Krehbiel, L.E., & MacKay, K.A. (1991). Involving colleges: Successful approaches to fostering student learning and development outside the classroom. San Francisco: Jossey-Bass.
Land, W.A., & Haney, J.J. (1989, November). The academic achievement of junior college students and computer assisted instruction. Paper presented at the Conference of the Mid-South Educational Research Association, Little Rock, AR. (ERIC Document Reproduction Service No. ED 317 191)
Langhorst, S.A. (1997). Changing the channel: Community colleges in the information age. Community College Review, 25(3), 55-72.

Lazarick, L. (1998). Managing the computer invasion. Community College Journal, 68(5), 26-29.

Lehman, P.W., Ethington, C.A., & Polizzi, T.B. (1995). Community College Student Experiences Questionnaire: Test manual and comparative data (2nd ed.). Memphis, TN: The University of Memphis, Center for the Study of Higher Education.
Luna, C.J., & McKenzie, J. (1997). Testing multimedia in the community college classroom. T.H.E. Journal, 24(7), 78-81.
Lundeberg, M.A., & Moch, S.D. (1995). Influence of social interaction on cognition: Connected learning in science. The Journal of Higher Education, 66(3), 312-335.
MacGregor, J. (1991). What difference do learning communities make? Washington Center News, 6, 4-9.

Maryland Longitudinal Study (1987, Fall). Students who become nonpersisters: Who, when, why, and to do what. The Astin index: One approach to predicting persistence at UMCP four years after initial enrollment (Research Reports No. 5 and 9). The University of Maryland College Park, Maryland Longitudinal Study Steering Committee.
Matthews, R. (1994). Enriching teaching and learning in learning communities. In
Terry O’Banion and Associates (Eds.), Teaching and Learning in the Community College. Washington, DC: Community College Press.
McComb, M. (1994). Benefits of computer-mediated communication in college courses. Communication Education, 43, 159-170.

Mijangos, J.H. (2001). Nontraditional students' learning and developmental experiences at two-year institutions: An assessment of Hispanic/Latino(a) students' experiences at selected community colleges in Iowa (Doctoral dissertation, Iowa State University, 2001). Dissertation Abstracts International, A 62/02, ISBN: 0-493-12191-9.
Miller, T.K., & Jones, J.D. (1981). Out of class activities. In A.W. Chickering (Ed.), The modern American college: Responding to the new realities of diverse students and a changing society. San Francisco: Jossey-Bass.
Morrison, J.L. (1999). The role of technology in education today and tomorrow: An interview with Kenneth Green, Part II. On the Horizon, 7(1), 2-5.

Moss, R.L., & Young, R.B. (1995). Perceptions about the academic and social integration of underprepared students in an urban community college. Community College Review, 22(4), 47-61.
Munro, B.H. (1981). Dropouts from higher education: Path analysis of a national sample. American Educational Research Journal, 18(2), 133-141.

Murrell, P.H., & Glover, J.W. (1996). The community college experience: Assessing process and progress. Community College Journal of Research and Practice, 20, 199-200.
Myers, R.L. (2001). Persistence of technical degree seekers. Unpublished doctoral
dissertation, University of Memphis, TN.
Napoli, A.R., & Wortman, P.M. (1996). A meta-analysis of the impact of academic and social integration on persistence of community college students. Journal of Applied Research in the Community College, 4(1), 5-21.
Pace, C.R. (1979). Measuring outcomes of college: Fifty years of findings and recommendations for the future. San Francisco: Jossey-Bass.

Pace, C.R. (1981, April). Measuring the quality of undergraduate education. Paper presented at the Annual Meeting of the American Educational Research Association, Los Angeles, CA.
Pace, C.R. (1982, May). Achievement and quality of student effort. Paper presented at a meeting of the National Commission on Excellence in Education, Washington, D.C.
Pace, C.R. (1984). Measuring the quality of student effort. Los Angeles: UCLA, Center for the Study of Evaluation.

Pace, C.R. (1996, May). A model for interpreting responses to the College Student Experiences Questionnaire. Paper presented at the Association for Institutional Research Forum, Albuquerque, NM.
Pace, C.R. (1998). Recollections and reflections. In J.C. Smart (Ed.), Higher education: Handbook of theory and research (Vol. XIII, pp. 1-34). New York: Agathon Press.
Paine, N. (1996). The role of the community college in the age of the internet. Community College Journal, 67(1), 32-37.

Pascarella, E.T. (1985). College environmental influences on learning and cognitive development: A critical review and synthesis. In J.C. Smart (Ed.), Higher education: Handbook of theory and research (Vol. I, pp. 1-61). New York: Agathon Press.
Pascarella, E.T., & Chapman, D.W. (1983a). Validation of a theoretical model of
college withdrawal: Interaction effects in a multi-institutional sample. Research in Higher Education, 19(1), 25-48.
Pascarella, E.T., & Chapman, D.W. (1983b). A multi-institutional path analysis of
Tinto’s model of college withdrawal. American Educational Research Journal, 20, 87-102.
Pascarella, E.T., Duby, P., & Iverson, B. (1983). A test and reconceptualization of a theoretical model of college withdrawal in a commuter institution setting. Sociology of Education, 56(2), 88-100.
Pascarella, E.T., Smart, J.C., & Ethington, C.A. (1986). Long-term persistence of two-year college students. Research in Higher Education, 24(1), 47-71.
Pascarella, E.T., & Terenzini, P.T. (1979). Interaction effects in Spady's and Tinto's conceptual models of college dropout. Sociology of Education, 52(4), 197-210.

Pascarella, E.T., & Terenzini, P.T. (1980). Predicting freshman persistence and voluntary dropout decisions from a theoretical model. Journal of Higher Education, 51(1), 60-75.
Pascarella, E.T., & Terenzini, P.T. (1983). Predicting voluntary freshman year persistence/withdrawal behavior in a residential university: A path analytic validation of Tinto's model. Journal of Higher Education, 52, 60-75.
Pascarella, E.T., & Terenzini, P.T. (1991). How college affects students: Findings and insights from twenty years of research. San Francisco: Jossey-Bass.

Pascarella, E.T., & Terenzini, P.T. (1998). Studying college students in the 21st century: Meeting new challenges. The Review of Higher Education, 21, 151-165.

Pavel, M. (1991). Assessing Tinto's model of institutional departure using American Indian and Alaskan Native longitudinal data. Paper presented at the Annual Meeting of the Association for the Study of Higher Education, Boston, MA.
Peters, O. (1992). Some observations on dropping out in distance education. Distance Education, 13(2), 234-269.

Phelps, D.G. (1994). What lies ahead for community colleges as we hurtle toward the 21st century. Community College Journal, 65(1), 22-25.

Polizzi, T.B., & Ethington, C.A. (1998). Factors affecting gains in career preparation: A comparison of vocational groups. Community College Journal of Research and Practice, 22(1), 39-52.
Preston, D.L. (1993, May). Using the CCSEQ in institutional effectiveness: The role of goal commitment and students' perceptions of gains. Paper presented at the Association for Institutional Research Forum, Chicago, IL. (ERIC Document Reproduction Service No. ED 360 022)
Preston, D.L. (1998). LONESTAR, CCSEQ, other data, and institutional effectiveness at Brazosport College. Paper presented at the Thirty-eighth Association for Institutional Research Forum, Minneapolis, MN. (ERIC Document Reproduction Service No. ED 419 564)
Privateer, P.M. (1999). Academic technology and the future of higher education: Strategic paths taken and not taken. Journal of Higher Education, 70(1), 60-80.
Raisman, N.A. (1999). Leave the field of dreams! Successful strategies for marketing the community college. Community College Journal, 69(4), 14-19.

Rayman, P., & Brett, B. (1995). Women science majors: What makes a difference in persistence after graduation? The Journal of Higher Education, 66(4), 388-414.

Ross, C.M. (1992). A study of academic and social integration in predicting student persistence at a residential, two-year college (Doctoral dissertation, Florida State University, 1992). Dissertation Abstracts International, 53.
Russo, P. (1993). Struggling for knowledge: Students, collaborative learning, and
community college. Unpublished doctoral dissertation. Syracuse, NY: Syracuse University.
Santoro, G.M. (1995). What is computer-mediated communication? In Z.L. Berge & M.P. Collins (Eds.), Computer-mediated communication and the online classroom: Overview and perspectives: Volume one (pp. 11-27). Cresskill, NJ: Hampton Press, Inc.
Simone, B.S. (1992). Computers in the curriculum. Community College Journal, 63(2), 4-5.

Smith, B.M. (1993). The effect of quality of effort on persistence among traditional-age community college students. Community College Journal of Research and Practice, 17, 103-122.
Smith, E.R., & Baxter, B. (1994). Community colleges and quality: A new approach to an old subject. Community College Journal, 65(3), 37-40.

Spady, W. (1970). Dropouts from higher education: An interdisciplinary review and synthesis. Interchange, 1, 64-85.

Spady, W. (1971). Dropouts from higher education: Toward an empirical model. Interchange, 2, 38-62.

Stage, F.K. (1988). University attrition: LISREL with logistic regression for the persistence criterion. Research in Higher Education, 29(4), 343-357.

Stage, F.K. (1990). Research on college students: Commonality, difference, and direction. Review of Higher Education, 13(3), 249-258.
Stewart, D.L. (1995, August). Community college student experiences report: Perspectives of the 1995 graduating class, Blue Ridge Community College. Weyers Cave, VA: Blue Ridge Community College, Office of Institutional Research. (ERIC Document Reproduction Service No. ED 402 960)
Study Group on the Conditions of Excellence in American Higher Education.
(1984). Involvement in learning: Realizing the potential of American higher education. Washington, DC: National Institute of Education.
Summary of the 1999 CBC CCSEQ with comparisons to 1996 results and 1999 statewide responses. (2000). (Report for the Columbia Basin College Office of Institutional Research and Marketing). Pasco, WA. (ERIC Document Reproduction Service No. ED 455 853)
Swigart, T.E., & Ethington, C.A. (1998). Ethnic differences in estimates of gains made by community college students. Community College Journal of Research and Practice, 22(8), 703-714.
Sworder, S. (1992). Analysis of the survey of student experiences at Saddleback College via the Community College Student Experiences Questionnaire (CCSEQ) (Report for the 1992 Saddleback College Accreditation). Mission Viejo, CA. (ERIC Document Reproduction Service No. ED 339 445)
Terenzini, P.T., & Pascarella, E.T. (1977). Voluntary freshman attrition and
patterns of social and academic integration in a university: A test of a conceptual model. Research in Higher Education, 6, 25-43.
Terenzini, P.T., & Pascarella, E.T. (1980). Toward the validation of Tinto’s
model of college student attrition: A review of recent studies. Research in Higher Education, 12(3), 271-282.
Terenzini, P.T., Pascarella, E.T., Theophilides, C., & Lorang, W. (1985). A
replication of a path analytic validation of Tinto’s theory of college student attrition. Review of Higher Education, 8, 319-340.
Tinto, V. (1975). Dropout from higher education: A theoretical synthesis of recent research. Review of Educational Research, 45(1), 89-125.

Tinto, V. (1982). Limits of theory and practice in student attrition. Journal of Higher Education, 53(6), 687-700.

Tinto, V. (1987). Leaving college: Rethinking the causes and cures of student attrition. Chicago: The University of Chicago Press.
Tinto, V., & Russo, P. (1993). A longitudinal study of the coordinated studies program at Seattle Central Community College. A study by the National Center on Postsecondary Teaching, Learning, and Assessment. University Park: Pennsylvania State University.
Tinto, V., Russo, P., & Kadel, S. (1994, February/March). Constructing
educational communities: Increasing retention in challenging circumstances. AACC Journal, 26-29.
Travis, J.E., & Travis, D.F. (1999). Survey of presidents reveals new trends in community college focus. Community College Journal, 69(4), 20-25.
United States Department of Education, Office of Vocational and Adult Education, Community College Liaison Office (1997, October). Investing in quality, affordable education for all Americans: A new look at community colleges.
Williamson, D.R., & Creamer, D.G. (1988). Student attrition in 2- and 4-year colleges: Application of a theoretical model. Journal of College Student Development, 29(3), 210-217.
Winter, D., McClelland, D., & Stewart, A. (1981). A new case for the liberal arts: Assessing institutional goals and student development. San Francisco: Jossey-Bass.

Wolfe, J.S. (1993). Institutional integration, academic success, and persistence of first-year commuter and resident students. Journal of College Student Development, 34, 321-326.
Yaskin, D., & Gilfus, S. (2002). Blackboard 5: Introducing the Blackboard 5 learning system. Blackboard Learning System [On-line]. Available: www.blackboard.com.