
Chapter 15

Course Assessment Basics: Evaluating Your Construction

Primary Authors:
Janet Fulks, Bakersfield College (Faculty)
Marcy Alancraig, Cabrillo College (Faculty)

With thanks for contributions from:
Bakersfield College Academic Development/Math Department
Cabrillo College English Department
Louise Barbato, LA Mission College (Faculty)
Dianne McKay, Mission College (Faculty)
The Mission College Reading Department


First off, due to the hard work of community college faculty across the state, and especially Student Learning Outcome coordinators, student learning outcomes and their assessment may already be in place at your campus. If so, your task as basic skills faculty, student services providers and/or administrators is to find out what has been done in your area. Many schools have written SLOs but have not yet figured out how to assess them. If that is the case for you, go to the Onward to Assessment portion of this chapter. If SLOs and assessments are already in place at your college, and you are more than familiar with them, but have questions about assessing complex programs like learning communities or reading and writing labs, go to Chapter 16, Advanced Assessment: Multiple Measures. If you are new to this entire work, wondering just what the x!?$%& a student learning outcome is anyway, stay right here.

Let’s start with a definition, written by one of the primary authors of this chapter and posted in the SLO Workbooks on the Cabrillo College SLO website. “Student learning outcomes (SLOs) describe the:

- knowledge
- skills
- abilities
- attitudes
- beliefs
- opinions
- values

that students have attained by the end of any set of college experiences – classes, occupational programs, degrees and certificates, and encounters with Student Services or the Library. The stress is on what students can DO with what they have learned, resulting in some sort of product that can be evaluated.”1

1. “Student Learning Outcomes and Instructional Planning: A Workbook,” Cabrillo College, http://pro.cabrillo.edu/SLOs, p. 40.


The 2002 Accreditation standards ask faculty to articulate student learning outcomes for each course, each occupational program and each degree and certificate that the school offers. In addition, they must also define them for Student Services and the Library. Then, they must design assessment activities that provide students with an opportunity to demonstrate what they have learned.

A bit about SLOs versus Course Objectives

“But we’ve always had course objectives in our course outline of record,” you think. “What’s the difference between them and an SLO?” Good question! Student learning outcomes for the classroom describe the knowledge, skills, abilities or attitudes that a student can demonstrate by the end of your course. They address higher-level thinking skills. “But wait,” you say. “We’re talking about basic skills courses.” Yes, but basic skills courses also require students to think critically, to analyze, evaluate and synthesize, as do all higher education classes. The very same thinking skills are put to use, though the students are not grappling with the specific academic discipline at the same level of sophistication as in transfer classes. The Cabrillo SLO website defines the differences between course objectives and SLOs as follows, including the chart shown below:

“When trying to define student learning outcomes for a course, think of the big picture. SLOs:

- Describe the broadest goals for the class, ones that require higher-level thinking abilities.
- Require students to synthesize many discrete skills or areas of content.
- Ask them to then produce something - papers, projects, portfolios, demonstrations, performances, art works, exams etc. - that applies what they have learned.
- Require faculty to evaluate or assess the product to measure a student’s achievement or mastery of the outcomes.

Course objectives are on a smaller scale, describing small, discrete skills or “nuts and bolts” that require basic thinking skills. Think of objectives as the building blocks used to produce whatever is assessed to demonstrate mastery of an outcome. Objectives can be practiced and assessed individually, but are usually only a portion of an overall project or application.”2

Objectives vs. Outcomes

Scope
Objectives: Skills, tools, or content to engage and explain a particular subject.
Outcomes: Overarching results and subsequent learning.

Target
Objectives: Details of content coverage and activities which make up a course curriculum.
Outcomes: Higher-level thinking skills that integrate the content and activities.

Major Influence
Objectives: Input (the “nuts and bolts”).
Outcomes: Output; observable evidence (behavior, skill, or discrete usable knowledge) of learning.

Number
Objectives: Objectives can be numerous, specific, and detailed to direct the daily activities and material.
Outcomes: SLOs are limited in number (5-9) to facilitate modification and improvement of teaching and learning.

2. Ibid., p. 41.

Are you still confused? Look at these Outcomes and Objectives from a basic skills reading course at Mission College. Note how these fall into the categories in the table above.

Upon completion of Reading 961 (two levels below college-level English) the student will:
1. Utilize vocabulary skills to comprehend assigned readings.
2. Determine and differentiate main ideas and supporting details in assigned readings.
3. Make appropriate inferences in assigned readings.

Reading 961 objectives:
1. Apply knowledge of vocabulary commonly used in college reading, writing, and speaking.
2. Identify main idea in assigned readings.
3. Identify supporting details in assigned readings.
4. Identify organizational patterns and relationships of ideas in assigned readings.
5. Utilize graphic organizers (mapping, outlining, summarizing) as a method of organizing ideas in prose reading.
6. Apply contextual clues as a method of improving comprehension through informing vocabulary in assigned readings.
7. Apply critical thinking skills including distinguishing fact from opinion, making inferences, and identifying author’s purpose and tone in assigned readings.
8. Apply reading and study techniques to enhance comprehension of college textbooks.

Can you see that the objectives are small discrete skills that build to the overall course outcomes? Here are a few more OUTCOMES from other basic skills courses.

Cabrillo College English 255: Basic Writing (two levels below college-level English)
1. Write short paragraphs and essays demonstrating basic sentence-level competency and culminating in a portfolio.
2. Comment on idea and writing strategies in reading assignments.

Bakersfield College Math 50: Modern College Arithmetic and Pre-Algebra
1. Demonstrate the ability to add, subtract, multiply, and divide whole numbers, integers, fractions, mixed numbers, and decimals.
2. Solve linear equations by: a) using the addition/subtraction property of equality, b) using the multiplication/division property of equality, and c) using both of the above properties together.
3. Translate English sentences to algebraic equations.
4. Simplify mathematical statements using the correct order of operations.
5. Calculate the perimeter and area of rectangles and triangles. Calculate the area and circumference of a circle.
6. Find equivalent forms of numbers (i.e., change fractions to decimals, percents to fractions, fractions to percents, decimals to fractions, decimals to percents, percents to decimals, mixed numbers to improper fractions, and improper fractions to mixed numbers).
7. Round whole numbers and decimals appropriately as directed.
8. Apply the concept of percent to real-world applications such as sales tax, discount, and simple interest.

LA Mission College Course SLOs and ESL/English PROGRAM SLOs


Cosumnes River College has an excellent short PowerPoint detailing the process of developing ESL SLOs at http://research.crc.losrios.edu/Marchand%20SLO%20Presentation.ppt#1

Onward to Assessment

Student learning outcomes are only the beginning. An SLO is an empty phrase without some attempt to assess or measure it. It is a building that has never been constructed. Once the walls have been raised and the interior has been finished, someone must walk the floors and make sure that everything works. In the construction industry, that job belongs to a building inspector who certifies that the building is safe for use. In education, it is the faculty’s role, whether in the classroom or in providing a student service. Assessment is a process in which someone asks, “What are the results of this effort? Can anything be improved?” Rather than depending on an outsider, educators must be the ones to design and create assessment processes and determine how to use the resulting data to improve teaching and learning.

So what is assessment?

First, remember WYMIWYG (WHAT YOU MEASURE IS WHAT YOU GET). Every time you assess a skill or knowledge, you are communicating that the information on that test or assignment is the most important information for your students to know. (It is why we get that irritating question in every class, “Will this be on the test?”) Indeed, that is the way it should be. We should assess what our students are able to do, based on the outcomes we desire and at the level (higher-order thinking skills) that we expect.

Defining (and Re-assessing) Assessment: A Second Try
T. A. Angelo (1995), AAHE Bulletin no. 48, p. 7.

"Assessment is an ongoing process aimed at understanding and improving student learning. It involves

• making our expectations explicit and public;

• setting appropriate criteria and high standards for learning quality;

• systematically gathering, analyzing, and interpreting evidence to determine how well performance matches those expectations and standards; and

• using the resulting information to document, explain, and improve performance.

When it is embedded effectively within larger institutional systems, assessment can help us

• focus our collective attention,

• examine our assumptions, and

• create a shared academic culture dedicated to assuring and improving the quality of higher education. "

The way you assess your students and the data you collect to improve teaching and learning will help you focus on the important and improvable aspects of your work. Here’s an example of a college that has institutionalized the asking and answering of assessment questions.

“City College of San Francisco—a much different, much larger institution—has developed a Web-based Decision Support System. The DSS contains data from 1998 through the present on student enrollment, student demand for classes, departmental productivity, student success as measured by grades, course completion, degrees and certificates, and student characteristics, all of which are available in response to queries from faculty and staff. An instructor of pre-collegiate English might use the system to find out if different student groups—by race or age—are particularly at risk in a key sequence of courses in which he or she is teaching. The department might use the system to see how changes in teaching and curriculum are reflected, or not, in patterns of student success over time. Importantly, we heard from CCSF institutional research staff about the need to work directly with faculty—one-on-one, in small groups, and by departments—to help them envision ways to use the information; the promise, that is, lies not only in supplying good information but in cultivating a demand for it.”3
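As a concrete (and entirely hypothetical) picture of the kind of query such a system supports, here is a minimal Python sketch. The table layout, course names, and records below are invented for illustration; they are not CCSF's actual schema.

```python
# Hypothetical sketch of a DSS-style query: success rates by student group
# in a pre-collegiate English sequence. All data and names are invented.
import pandas as pd

enrollments = pd.DataFrame({
    "course":    ["ENGL 90", "ENGL 90", "ENGL 90", "ENGL 91", "ENGL 91", "ENGL 91"],
    "age_group": ["<25", "25+", "<25", "25+", "<25", "25+"],
    "succeeded": [1, 0, 1, 0, 1, 1],   # 1 = grade of C or better
})

# Success rate for each student group in each course of the sequence.
rates = enrollments.groupby(["course", "age_group"])["succeeded"].mean()
print(rates)
# A large gap between groups in a key course flags a place to investigate.
```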

So, how do we do this? The answer is through carefully using Formative and Summative Assessments. But what the heck is Formative Assessment?

Formative Assessment is a kind of evaluation created to help students improve performance. It has low stakes with regard to grading, but it allows students to practice, rehearse or apply the things most valuable to attaining the outcomes of the course. Often quizzes and homework represent this type of assessment. Formative assessment is most important in its role as a diagnostic tool, which allows you to 1) identify areas of deficiency, 2) prescribe alternative learning strategies, and 3) motivate the student to a deeper learning experience.

3. Hutchings, P. and Shulman, L.S. (2007). Perspectives: Learning about Student Learning from Community Colleges. The Carnegie Foundation for the Advancement of Teaching. Retrieved February 16, 2008, from http://www.carnegiefoundation.org/perspectives/sub.asp?key=245&subkey=1096


Summative Assessment, on the other hand, provides a final opportunity for students to show you what they are able to do with what they’ve learned. Summative assessment data can be used as a concluding judgment regarding grades and as your last evaluation of the pedagogy and content in your course. It’s high stakes and scheduled at a time when students have had the opportunity for feedback and improvement. The key to making summative assessment work is that it needs to be both fair and authentic. “Authentic Assessment” by Wiggins, in the appendix, provides more details.

“Postsecondary assessment done right must be rooted in the course and in the classroom, in the individual cells, to speak metaphorically, where the metabolism of learning actually takes place” (Wright, 1999).

The second step to improving your work through assessment is to determine the kind of data that will inform your teaching in the most effective ways. Data is a frightening word to many builders and faculty, but here are four important concepts about data that will help you grab this hot two-by-four by the end: direct versus indirect data, and qualitative versus quantitative data.

Direct versus Indirect

We often refer to direct and indirect data. Direct assessments evaluate what students can actually do. They are things you can witness with your own eyes: in class, through papers and exams, speeches or presentations. The setting for those assessment activities is usually confined and structured.

Indirect assessments don’t get at what students can actually do but ask for opinions about it, either from students themselves or from others who might be able to judge. These assessment activities are often in the form of surveys or self-assessments. When used with students, they tend to focus on the learning process or environment; the actual learning itself is inferred. The setting for these assessments can be the classroom, but may occur elsewhere, so it’s not easily contained or structured.

Confused? Try taking this quiz to help you get a deeper understanding of the terms. Evaluate the sources of data below. Select whether they provide direct data or indirect data concerning the issue at hand.

1. Polling information on who people will vote for in an election. a. direct data b. indirect data


2. The actual vote count reported the evening after the national election. a. direct data b. indirect data

3. People’s opinion about their favorite make of car. a. direct data b. indirect data

4. The number and make of automobiles actually sold. a. direct data b. indirect data

5. Student learning assessed by essays graded by a rubric. a. direct data b. indirect data

6. Students’ opinions about their writing ability. a. direct data b. indirect data

7. A student satisfaction survey on the difficulty of science classes. a. direct data b. indirect data

8. Data on student success in science classes. a. direct data b. indirect data

See Appendix 1 for answers to Quiz on Direct and Indirect Data.

Direct data will indicate the areas of deficiency. In response to it, you need to review your students’ prerequisite knowledge, study skills, your own pedagogy, the methods of assessment used, and a variety of other issues related to the teaching and learning process. In contrast, indirect data provides valuable information on perceptions, which are the reality in that person’s mind. Responding to indirect data may mean clarifying expectations, changing the way you present things, helping others to see the criteria more clearly, or providing data that changes those perceptions. For example, indirect data from a survey of science and engineering students revealed that the majority of students felt that joining a study group was an admission of inadequacy and an indicator that they would not “make the grade.” Direct data showed that students involved in study groups had better grades and documented improvement, so these perceptions were wrong. Faculty had to respond to this data by working with student perceptions.
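The study-group example is worth making concrete. Here is a minimal sketch with invented numbers; the point is the contrast between the indirect (survey) data and the direct (grade) data.

```python
# Invented data illustrating the science/engineering study-group example:
# indirect data (perceptions) versus direct data (actual grades).

# Indirect: responses to "Joining a study group means I can't make the grade."
survey = {"agree": 62, "disagree": 38}  # percent of respondents (hypothetical)

# Direct: grade points for students in and out of study groups (hypothetical).
study_group_gpas = [3.1, 2.8, 3.4, 3.0, 2.9]
solo_gpas = [2.4, 2.7, 2.2, 2.6, 2.5]

def mean(values):
    return sum(values) / len(values)

print(f"Perception: {survey['agree']}% see study groups as a sign of inadequacy")
print(f"Reality: study-group mean GPA {mean(study_group_gpas):.2f} "
      f"vs. solo mean GPA {mean(solo_gpas):.2f}")
# The direct data contradicts the perception, so faculty respond by working
# with student perceptions rather than by changing the course content.
```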


Qualitative versus Quantitative

Whether the data is direct or indirect, it may be collected as numbers (quantitative) or in another format that does not involve numbers (qualitative).

Qualitative data is collected as descriptive information, such as in a narrative or portfolio. These types of data, often found in open-ended questions, feedback surveys, or summary reports, are more difficult to compare, reproduce, and generalize. This kind of data is also bulky to store and to report; however, it is often the most valuable and insightful, often providing potential solutions or modifications in the form of feedback.

Its companion, quantitative data, is collected as numerical or statistical values. These data use actual numbers (scores, rates, etc.) to express quantities of a variable. Qualitative data, such as opinions, can be displayed as numerical data by using Likert-scaled responses, which assign a numerical value to each response (e.g., 5 = strongly agree to 1 = strongly disagree). This data is easy to store and manage, and it can be generalized and reproduced, but it has limited value due to the rigidity of the responses and must be carefully constructed to be valid. Many people fear that the only data allowed for assessment results is quantitative, but this is not so.
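To make the Likert-scale idea concrete, here is a minimal sketch, with invented survey responses, of how qualitative opinions become quantitative data once each response is assigned a numerical value:

```python
# Convert qualitative Likert responses to quantitative scores.
# The responses below are invented for illustration.

LIKERT_SCALE = {
    "strongly agree": 5,
    "agree": 4,
    "neutral": 3,
    "disagree": 2,
    "strongly disagree": 1,
}

responses = [
    "strongly agree", "agree", "agree", "neutral",
    "disagree", "strongly agree", "agree",
]

scores = [LIKERT_SCALE[r] for r in responses]
mean_score = sum(scores) / len(scores)

print(f"n = {len(scores)}, mean = {mean_score:.2f}")
# The numbers are easy to store and compare across semesters, but they
# lose the nuance an open-ended comment would carry.
```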

Try another Quiz

1. A faculty member is convinced that field trips are the most effective way to teach geology, but the trips are impacting the budget. Which data would be most convincing in a budget discussion?

a. A narrative on the benefits of field trips (qualitative data)

b. A collection of student opinions about field trips (indirect data; could be qualitative or quantitative)

c. An example of student grades related to topics covered on the field trip that compares the scores of students who went on the field trip and those who did not (direct, quantitative)

d. A list indicating the number of other institutions and geology programs that support geology field trips as an integral part of the pedagogy (indirect, quantitative)

e. A combination of these data

2. An ESL instructor has discovered from feedback from her students that the most important outcome they are hoping for is proper pronunciation when they speak. Which would be the most useful type of assessment data both for the individual student and for the course outcomes as a whole?

a. Direct statistical data gleaned from a multiple choice test about the fundamental rules in proper pronunciation (quantitative).

b. A national standardized ESL test (quantitative).

c. A student log book created as a result of listening to and analyzing recordings of their own speaking (qualitative).

d. An interview developed to assess pronunciation and evaluated using a rubric that indicates the major types of errors and a narrative summary of the overall pronunciation expertise (could be qualitative and quantitative).


e. A classroom speech evaluated by comments from fellow classmates (qualitative).

3. For the annual program review update in the mathematics department the faculty discussed the results of a word problem assessment embedded in the final exam of each section of each class. The assessment was graded with a rubric that faculty had thoroughly discussed in order to norm or make their judgments consistent. What kind of data would be most useful in the departmental annual report using this assessment?

a. Individual scores of every single student (direct, quantitative data).

b. Aggregated (combined data) for all the students in each type of course (direct, quantitative data).

c. A narrative report about what the department learned after analyzing the results (qualitative data).

d. A look at the scores achieved by different student populations (quantitative data).

e. The average score of each faculty member’s class section (quantitative data).

4. Reading faculty hypothesize that students are less likely to sign up for reading classes in the academic development department than they are to sign into linked reading courses attached to General Education courses (learning communities) where the texts in the course serve as the reading text. What type of data would help explore the validity of this hypothesis?

a. A survey asking students whether they would prefer to sign up for a learning community or a reading class (indirect data; could be qualitative or quantitative).

b. Database information showing the number of students with placement test scores below collegiate level (quantitative data).

c. A comparison of student grades in General Education courses and their reading placement test results (quantitative data).

d. A narrative created from focus groups of students discussing reading problems (qualitative data).

e. An analysis of the number of students at another institution that has both linked reading classes and Academic Development reading classes.

5. A faculty member is trying to improve her classroom presentation methods. Which would provide the best feedback?

a. A questionnaire with Likert-scaled student responses providing options to select: a) this method is helpful, b) this method is not helpful, or c) this method confuses me (indirect, quantitative data)

b. A demonstration of a method followed by open-ended discussion and feedback from the students (indirect, qualitative data)

c. A self-evaluation by the faculty member listing the pros and cons of each method (indirect, qualitative)

d. A review of the student test results directly related to that presentation method (direct, quantitative)

e. A combination of the above


See Appendix 2 for answers to Quiz #2.

Developing or using these assessment techniques does not have to be burdensome. There are many really useful techniques for diverse disciplines available online and in highly recognized resources. One source that has revolutionized teaching and learning is Classroom Assessment Techniques by Angelo and Cross. Classroom Assessment Techniques (CATs) are a group of well-known and often-used formative assessments. CATs are “simple tools for collecting data on student learning in order to improve it” (Angelo & Cross, 1993, p. 26). CATs are short, flexible classroom techniques that provide rapid, informative feedback to improve classroom dynamics by monitoring learning, from the student’s perspective, throughout the semester.

Faculty can use a CAT at any point during a class session (beginning, middle or end). After evaluating the results, you know what students are getting and can quickly change your teaching plans to fill in gaps or clarify misunderstandings. CATs work best when student responses are anonymous and the results are shared with students at the next class session. Some popular examples are “The Minute Paper,” “One-Sentence Summary,” “Chain Notes,” and “Application Cards” (see the chart below, from the National Teaching and Learning Forum, and the short tallying sketch that follows it). You can find more examples and a more detailed explanation of how to use them in Classroom Assessment Techniques: A Handbook for College Teachers by Angelo & Cross (1993). For other discipline-specific downloadable CATs, see http://www.flaguide.org/cat/cat.php

Minute Paper [2]
Description: During the last few minutes of the class period, ask students to answer on a half-sheet of paper: “What is the most important point you learned today?” and “What point remains least clear to you?” The purpose is to elicit data about students’ comprehension of a particular class session.
What to do with the data: Review responses and note any useful comments. During the next class period, emphasize the issues illuminated by your students’ comments.
Time required: Prep: Low. In class: Low. Analysis: Low.

Chain Notes
Description: Students pass around an envelope on which the teacher has written one question about the class. When the envelope reaches a student, he or she spends a moment responding to the question and then places the response in the envelope.
What to do with the data: Go through the student responses and determine the best criteria for categorizing the data, with the goal of detecting response patterns. Discussing the patterns of responses with students can lead to better teaching and learning.
Time required: Prep: Low. In class: Low. Analysis: Low.

Memory Matrix
Description: Students fill in the cells of a two-dimensional diagram for which the instructor has provided labels. For example, in a music course, labels might be periods (Baroque, Classical) by countries (Germany, France, Britain); students enter composers in the cells to demonstrate their ability to remember and classify key concepts.
What to do with the data: Tally the numbers of correct and incorrect responses in each cell. Analyze differences both between and among the cells. Look for patterns among the incorrect responses and decide what might be the cause(s).
Time required: Prep: Med. In class: Med. Analysis: Med.

Directed Paraphrasing
Description: Ask students to write a layman’s “translation” of something they have just learned, geared to a specified individual or audience, to assess their ability to comprehend and transfer concepts.
What to do with the data: Categorize student responses according to characteristics you feel are important. Analyze the responses both within and across categories, noting ways you could address student needs.
Time required: Prep: Low. In class: Med. Analysis: Med.

One-Sentence Summary
Description: Students summarize knowledge of a topic by constructing a single sentence that answers the questions “Who does what to whom, when, where, how, and why?” The purpose is to require students to select only the defining features of an idea.
What to do with the data: Evaluate the quality of each summary quickly and holistically. Note whether students have identified the essential concepts of the class topic and their interrelationships. Share your observations with your students.
Time required: Prep: Low. In class: Med. Analysis: Med.

Exam Evaluations
Description: Select a type of test that you are likely to give more than once or that has a significant impact on student performance. Create a few questions that evaluate the quality of the test. Add these questions to the exam or administer a separate, follow-up evaluation.
What to do with the data: Try to distinguish student comments that address the fairness of your grading from those that address the fairness of the test as an assessment instrument. Respond to the general ideas represented by student comments.
Time required: Prep: Low. In class: Low. Analysis: Med.

Application Cards
Description: After teaching about an important theory, principle, or procedure, ask students to write down at least one real-world application for what they have just learned to determine how well they can transfer their learning.
What to do with the data: Quickly read once through the applications and categorize them according to their quality. Pick out a broad range of examples and present them to the class.
Time required: Prep: Low. In class: Low. Analysis: Med.

Student-Generated Test Questions
Description: Allow students to write test questions and model answers for specified topics, in a format consistent with course exams. This will give students the opportunity to evaluate the course topics, reflect on what they understand, and consider what good test items are.
What to do with the data: Make a rough tally of the questions your students propose and the topics that they cover. Evaluate the questions and use the good ones as prompts for discussion. You may also want to revise the questions and use them on an upcoming exam.
Time required: Prep: Med. In class: High (may be homework). Analysis: High.

[2] The Bureau of Evaluative Studies and Testing (BEST) can administer the Minute Paper electronically.
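To show how light the “what to do with the data” step can be, here is a minimal sketch that tallies Minute Paper responses into rough themes; the responses and theme keywords are invented:

```python
# Hypothetical tally of Minute Paper responses to "What point remains
# least clear to you?", grouped into rough themes by keyword.
from collections import Counter

responses = [
    "topic sentences", "finding the main idea", "inference",
    "main idea vs. details", "inference", "making inferences",
]

def theme(response):
    # Crude keyword matching; a real reading would be done by eye.
    if "inference" in response:
        return "inferences"
    if "main idea" in response or "topic" in response:
        return "main ideas"
    return "other"

tally = Counter(theme(r) for r in responses)
print(tally.most_common())   # e.g. [('main ideas', 3), ('inferences', 3)]
```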

Applying Assessment Basics to Real Life

Do you remember the Reading 961 SLOs from Mission College? Upon completion, students will:

1. Utilize vocabulary skills to comprehend assigned readings.
2. Determine and differentiate main ideas and supporting details in assigned readings.
3. Make appropriate inferences in assigned readings.

Let’s look at the first outcome: Utilize vocabulary skills to comprehend assigned readings. This is a skill. How would you assess a student’s ability to use vocabulary-building strategies? You might:

- Give them some reading with new vocabulary and document how they deal with the new vocabulary (a direct measure).
- Have them explain to others how they deal with new vocabulary (an indirect measure).


Use the space below to jot down your thoughts about pros and cons of each method and what might work best for you or for other faculty at your school (Hint: Think about whether or not an indirect measure will give you enough information about how well a student can perform a skill).

Let’s look at the second and third outcomes: Determine and differentiate main ideas and supporting details in assigned readings and Make appropriate inferences in assigned readings. In these, cognitive abilities are being assessed, looking at what students actually know. There are many different ways to assess these outcomes. Here are some possibilities. In response to an assigned reading, students could:

- Identify the main idea, supporting details and inferences in three selected paragraphs in a text (a direct measure).
- Answer ten questions about the main ideas, supporting details and inferences in the article (a direct measure).
- Write about them in a reading log (a direct measure).
- Map them (a direct measure).
- Debate them, using supporting details from the text (a direct measure).
- Describe their process of coming to understand the article (an indirect measure).

Again, use the space below to note the pluses and minuses of each assessment method and discuss what might work in a reading class at your college.

Finally, let’s look at an outcome from another reading course from Mission College, one designed for students at the level below the one we’ve been discussing (i.e., three levels below college-level English). One of the outcomes for Reading 960 reads: Perceive themselves as growing in reading competence. This one targets a student’s beliefs or values. Many basic skills classes seek to explore students’ self-perceptions. This can be tricky to assess. You have to use indirect measures, because this outcome is about the student’s perception of his or her abilities. You could have students:

- Describe to someone else in the class how competent they feel about reading.
- Complete a survey about their reading abilities and competence at the end of the class.
- Write an essay about how they have changed as a reader over the course of the class.
- Survey their attitudes about their competency in reading at the beginning and end of the course, comparing the results.


Use the space below to note the merits of each method. Which ones seem most feasible, taking the least amount of time for you or your colleagues at your school?

Speaking of time, we do expect you to be worrying about it at this point. “Wait a minute,” we hear you say, “are you telling me that I have to assess every outcome for my class in addition to what I regularly do? I’m too busy! I have a lot of assignments to grade, not to mention an entire semester’s worth of material to cover. What do you want from me?”

But stop and think for a moment. Would you ever construct a building without examining it before you let someone move in? Surely you would want to walk through each room and scrutinize it carefully, perhaps testing the soundness of the walls or floor. In the same way, you cannot afford to cover the content of your course without assessing what the students have learned. Even if you did manage to cover everything, a job well done on your part, you could discover that your students are not able to DO anything with that material. Assessment is the building inspection!

The good news is that we have an effective method that draws on the major assignments you’re already using in your class. This is called course-embedded assessment. Using this method, you can choose an assignment that you have to grade anyway as an assessment source. Note that the many suggestions for assessments listed above are all class assignments. Though you will grade that assignment like you usually do, you’ll also be analyzing the results from a slightly different perspective. What does the performance of your students on this particular assignment tell you about their issues and needs with the material? What can you discover about what they’re learning?

“Okay,” you say, “so why can’t I just use my grades as an assessment?” Good question! But think about our purpose here. A letter grade is a summation of student performance; it doesn’t provide information about the different parts of the assignment that would let you see where students are grasping the material and where they might be lost. Remember that assessment asks you to formalize what you probably do intuitively each time you grade a set of assignments: analyzing what you think your students learned and planning for what you need to change to help them improve. The way to grade an assignment so that it can authenticate or confirm that intuitive information is to use a rubric or primary trait scale. If you are unfamiliar with these terms, please see the appendix of Chapter 14 for a detailed description of what rubrics are and how to create them.

Not every assignment you give to a class can do double duty as an assessment measure. Some quizzes or homework simply test lower-level knowledge, the type that is listed in the course objectives. Assignments that you can also use as an assessment must measure higher-level thinking skills related to the course outcomes. So examine your assignments and look for the ones that ask students to synthesize and think critically, the ones that target the outcomes for your specific course.
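To see why a rubric yields more actionable data than a letter grade alone, consider this minimal sketch; the trait names and scores are invented:

```python
# A letter grade collapses an assignment into one symbol; a rubric keeps
# the parts visible. Trait names and scores below are invented.

# Each row: one student's rubric scores (0-4) on an essay assignment.
rubric_scores = [
    {"main_idea": 4, "supporting_details": 3, "inference": 1},
    {"main_idea": 3, "supporting_details": 3, "inference": 2},
    {"main_idea": 4, "supporting_details": 2, "inference": 1},
]

for trait in rubric_scores[0]:
    avg = sum(s[trait] for s in rubric_scores) / len(rubric_scores)
    print(f"{trait}: class average {avg:.2f}")

# The per-trait averages show students handle main ideas well but struggle
# with inference; a stack of letter grades would hide exactly that finding.
```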

Closing the Loop

The final step in the assessment of student learning outcomes is often called “Closing the Loop.” The term refers to taking the time to look carefully at the data you have collected and analyzing what it suggests you can do to improve teaching and learning. Writing SLOs and assessing them is no good if the final reflective step isn’t completed. The good news is that it can be both the most rewarding and most enjoyable part of the assessment process, especially if it results in dialogue with other colleagues about what is going on in your classrooms or department. Some community colleges, ones that have embarked on assessment processes that ask colleagues to share results, report that meetings have become more meaningful. They are actually talking about teaching instead of budget cuts, college business or even parking!

The chart below outlines the process. Note how circular it is. It keeps going, each step feeding into the next. Ideally, assessment never ends. It simply continues in the same way that you continue to informally evaluate what happens in your classes.

Closing the Assessment Loop
1. Develop, modify, or review a curriculum, course, program, or service.
2. Develop or modify Student Learning Outcomes (SLOs).
3. Design and measure student learning as a result of the curriculum, course, or program.
4. Collect, discuss, and analyze data.
5. Determine refinements based on outcomes data, then return to step 1.

Many colleges are tying the assessment loop to Program Review. It makes great sense to evaluate assessment results and then plan how to improve teaching and learning based on those results. At some colleges across the state, assessment results are now being used to justify budgetary requests. Not only can assessment let you know what is going on in your classes, but it can also provide you with funds to continue your good work.

Let’s take a look at how a department can close the loop on assessing a basic skills class. Mission College used this process to assess Reading 961, the course whose SLOs you looked at at the beginning of this chapter. This is only one of many methods a department can choose to close the loop; it is simply an example. Here are the steps they took:

Step One: They wrote SLOs for the class:

1. Utilize vocabulary skills to comprehend assigned readings.
2. Articulate main ideas and make inferences in assigned readings.

Step Two: They decided on an assessment method to measure SLO #2: “Students will be given an assignment asking them to read a selection that can be evaluated in terms of one of the outcomes above. This assignment will then be evaluated by the classroom instructor using a rubric designed for this purpose.” Note that the department chose a course-embedded assessment method, using a rubric to “grade” the assignment. All of the faculty teaching the course (both full time and adjunct) participated. The assignments varied: some instructors asked students to answer ten questions about an assigned text, while others asked them to write the main point in one sentence and then list supporting details. Still another asked students to map the reading and then either write or ask questions about it.

Step Three: Each instructor analyzed his or her results and recorded them on an Individual Faculty Results Form that the department had created for this purpose. The Mission faculty have graciously given us permission to share some of their forms, which appear at the end of this chapter. You may find their analysis of the student work interesting.

Step Four: They decided on a process to close the loop in which they would meet to share their individual results and discuss how to improve the course: “The department will then set priorities and create an instructional plan with a timeline. The results/analysis, recommendations, instructional plan and timeline will then be recorded on the Department Analysis Form.” This form is also included at the back of this chapter.
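Steps Three and Four amount to rolling each instructor's results up into one departmental picture. Here is a minimal sketch of that aggregation; the section labels and numbers are invented:

```python
# Hypothetical aggregation of Individual Faculty Results Forms into a
# department-level summary, as in Steps Three and Four. Numbers are invented.

# Per-section results: how many students met SLO #2 on the rubric-scored,
# course-embedded assignment.
section_results = {
    "section_01": {"n": 24, "met_slo": 15},
    "section_02": {"n": 19, "met_slo": 11},
    "section_03": {"n": 27, "met_slo": 14},
    "section_04": {"n": 22, "met_slo": 16},
}

total_n = sum(r["n"] for r in section_results.values())
total_met = sum(r["met_slo"] for r in section_results.values())

print(f"Department-wide: {total_met}/{total_n} "
      f"({100 * total_met / total_n:.0f}%) met SLO #2")
# The aggregate, not any one instructor's numbers, is what drives the
# Department Analysis Form and the instructional plan.
```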

Step Five: They put their plans to the test. They discovered four things from doing this assessment process together:

1. “…The need for repetition and explicit instruction for learning to take place.”
2. “…Absences affect student learning, and students who had excessive absences, and/or the occurrence of two holidays in November prior to giving the assessment may have affected the outcomes.”
3. “…Students seem to do well on prescriptive work, but have difficulty creating structure and meaning for themselves.”
4. “Instructors were interested to note that they came to the same conclusions regarding teaching and learning based on 4 different assessments, rubrics and students. Results were remarkably consistent.”

Step Six: They made plans for improvement.

1. “Need to rewrite SLO to separate Main Idea and Supporting Details from Inferences. Cannot measure both in one SLO.”

2. “Make the SLOs explicit so students know their performance expectations and provide ample practice.”

3. “Recognize the need to repeat the Main Idea SLO in various ways: outlining, mapping and summarizing, throughout the semester.”

Note that the assessment revealed the need for a third SLO, which was added to the course and included in the list you worked with earlier in the chapter. Revising and improving your own SLOs or assessment methods is key to a successful process. Assessment and SLOs continue to evolve. Going through an entire assessment cycle, where you close the loop, will give you the information you need to make the process better and more workable for your department.


A Few Words of Final Advice

1. Keep it simple. Don’t create a process that is so cumbersome and difficult that faculty won’t want to do it. Create something that doesn’t take a lot of time.

2. Keep it safe. SLO Assessment is not to be used to evaluate individual faculty.

3. Focus on just one or two SLOs at a time. You need not assess EVERY SLO all the time.

4. Start small. You don’t have to do everything at once. Complete an entire loop with one course or a series of classes and see what it teaches you. Just do it.

5. The creation and analysis of assessment methods is a faculty responsibility. Don’t give it over to people who are not in your classroom.

6. Complete an assessment cycle. Use the completed cycle to improve the way you approach assessing other outcomes.

7. Make it fun.

8. Keep it sustainable – don’t create something that cannot be continued.


Appendix
Chapter 15 Course Assessment Basics: Evaluating Your Construction

Appendix 1: Answers to Quiz on Direct and Indirect Data
Appendix 2: Answers to Quiz on Qualitative and Quantitative Data
Appendix 3: Mission College Reading Department Assessment Report on Reading 961
- Departmental Assessment Results Form
- Individual Instructor report forms, assignments and rubrics
Appendix 4: Long Beach City College Materials for Determining Evaluation Techniques
Appendix 5: Choosing the Right Assessment Tool
Appendix 6: Resources for Chapter 15


Appendix 1
Answers to Quiz on Direct and Indirect Data

1. Polling information on who people will vote for in an election: indirect data
2. The actual vote count reported the evening after the national election: direct data
3. People’s opinion about their favorite make of car: indirect data
4. The number and make of automobiles actually sold: direct data
5. Student learning assessed by essays graded by a rubric: direct data
6. Students’ opinions about their writing ability: indirect data
7. A student satisfaction survey on the difficulty of science classes: indirect data
8. Data on student success in science classes: direct data


Appendix 2
Quiz on Qualitative and Quantitative Data

1. A faculty member is convinced that field trips are the most effective way to teach geology, but the trips are impacting the budget. Which data would be most convincing in a budget discussion?

a. A narrative on the benefits of field trips (qualitative data)

b. A collection of student opinions about field trips (indirect data; could be qualitative or quantitative)

c. An example of student grades related to topics covered on the field trip that compares the scores of students who went on the field trip and those who did not (direct, quantitative)

d. A list indicating the number of other institutions and geology programs that support geology field trips as an integral part of the pedagogy (indirect, quantitative)

e. A combination of these data

2. An ESL instructor has discovered from feedback from her students that the most important outcome they are hoping for is proper pronunciation when they speak. Which would be the most useful type of assessment data both for the individual student and for the course outcomes as a whole?

a. Direct statistical data gleaned from a multiple choice test about the fundamental rules in proper pronunciation (quantitative).

b. A national standardized ESL test (quantitative).

c. A student log book created as a result of listening to and analyzing recordings of their own speaking (qualitative).

d. An interview developed to assess pronunciation and evaluated using a rubric that indicates the major types of errors and a narrative summary of the overall pronunciation expertise (could be qualitative and quantitative).

e. A classroom speech evaluated by comments from fellow classmates (qualitative).

3. For the annual program review update in the mathematics department the faculty discussed the results of a word problem assessment embedded in the final exam of each section of each class. The assessment was graded with a rubric that faculty had thoroughly discussed in order to norm or make their judgments consistent. What kind of data would be most useful in the departmental annual report using this assessment?

a. Individual scores of every single student (direct, quantitative data).

b. Aggregated (combined data) for all the students in each type of course (direct, quantitative data).

c. A narrative report about what the department learned after analyzing the results (qualitative data).

d. A look at the scores achieved by different student populations (quantitative data).

e. The average score of each faculty member’s class section (quantitative data).


4. Reading faculty hypothesize that students are less likely to sign up for reading classes in the academic development department than they are to sign into linked reading courses attached to General Education courses (learning communities) where the texts in the course serve as the reading text. What type of data would help explore the validity of this hypothesis?

a. A survey asking students whether they would prefer to sign up for a learning community or a reading class (indirect data; could be qualitative or quantitative).

b. Database information showing the number of students with placement test scores below collegiate level (quantitative data).

c. A comparison of student grades in General Education courses and their reading placement test results (quantitative data).

d. A narrative created from focus groups of students discussing reading problems (qualitative data).

e. An analysis of the number of students at another institution that has both linked reading classes and Academic Development reading classes.

5. A faculty member is trying to improve her classroom presentation methods. Which would provide the best feedback?

a. A questionnaire with Likert-scaled student responses providing options to select: a) this method is helpful, b) this method is not helpful, or c) this method confuses me (indirect, quantitative data)

b. A demonstration of a method followed by open-ended discussion and feedback from the students (indirect, qualitative data)

c. A self-evaluation by the faculty member listing the pros and cons of each method (indirect, qualitative)

d. A review of the student test results directly related to that presentation method (direct, quantitative)

e. A combination of the above


Appendix 3
Reading 961 Student Learning Outcomes Assessment Results
Fall 2007

Participants: Ina Gard, Aaron Malchow, Alice Marciel, Dianne McKay

Submitted to Title V Project, January 7, 2008

Process Summary

Four members of the reading department participated in a study to validate one of the student learning outcomes for Reading 961, the mid-level developmental reading course at Mission College. The catalog description for this course is:

Reading 961: Effective Reading (Non-Associate Degree Course), 3.0 units
Prerequisites: READ 960, or ESL 970RW, ESL 970G and ESL 970LS, or qualifying score on the placement test.

This developmental course is designed for students who wish to correct or improve basic reading habits and skills, including expanding vocabulary, improving comprehension and attaining an efficient reading rate. The content and objectives of this course will vary somewhat to meet the student’s individual needs. Some study skills may be included. May be repeated once for credit. Credit/No Credit Option.

The department’s assessment plan had several components. The participating faculty met to discuss the project and the course SLOs, did individual work in preparation for additional discussion on the assessment tools and rubrics, conducted their assessment and evaluated the results. The plan follows:


Reading Department Assessment Plan: Reading 961

Outcomes
Upon completion of Reading 961 the student will:
1. Utilize vocabulary skills to comprehend assigned readings.
2. Articulate main ideas and make inferences in assigned readings.

Objectives
1. Apply knowledge of vocabulary commonly used in college reading, writing, and speaking.
2. Identify main idea in assigned readings.
3. Identify supporting details in assigned readings.
4. Identify organizational patterns and relationships of ideas in assigned readings.
5. Utilize graphic organizers (mapping, outlining, summarizing) as a method of organizing ideas in prose reading.
6. Apply contextual clues as a method of improving comprehension through informing vocabulary in assigned readings.
7. Apply critical thinking skills including distinguishing fact from opinion, making inferences, and identifying author’s purpose and tone in assigned readings.
8. Apply reading and study techniques to enhance comprehension of college textbooks.

Assessment Process and Evaluation
Students will be given an assignment asking them to read a selection that can be evaluated in terms of one of the outcomes above. This assignment will then be evaluated by the classroom instructor using a rubric designed for this purpose. The results (with any analysis) will be recorded on the Individual Faculty Results Form and presented to the department, with recommendations, and a discussion will follow with the goal of generating additional recommendations for the course and for the department. The department will then set priorities and create an instructional plan with a timeline. The results/analysis, recommendations, instructional plan and timeline will then be recorded on the Department Analysis Form.

The discussion will take place once or twice a year, possibly on Flex Day. The whole department will be invited, and part-time faculty will be paid to attend with money from the Title V grant (for a limited time).


Following is the timeline for training and implementing the plan during Fall 07 semester:

Training (September 25): 1.5 Hours

Selection/Refinement of Assessment Tool and Rubric (On Your Own): 4

Hours

Training (October 16, Workshopping Assessment Tools): 1.5 Hours

Training (November 13, Workshopping Rubrics): 1.5 Hours

Preparing Results/Filling out Forms (On Your Own): 4 Hours

Training (December 4, Reporting out Results/Brainstorming Session): 2

Hours

The results of the SLO validation follow in summary form, along with the individual instructor’s rubrics, assignments and rating sheets. One of the immediate results of the study was the need to rewrite the current SLO to separate the measurement of main ideas and supporting details from inferences. We found it too difficult to measure this combination. As a result, the rewritten SLOs for Reading 961 are:

Upon completion of Reading 961 the student will: 1. Utilize vocabulary skills to comprehend assigned readings. 2. Determine and differentiate main ideas and supporting details in assigned readings. 3. Make appropriate inferences in assigned readings.

Student Learning Outcomes Assessment Summary

The individual instructors created their own assignments and rubrics, implemented their assessments and evaluated the results. These results and rubrics can be viewed in the attached appendix. As a result of these assessments and of discussion among the faculty, the following summary analysis and plan were created.


Reading Department Analysis
Reading 961 SLO Validation
Fall 2007

Course: Reading 961
Meeting Date: December 4, 2007
Number of Faculty/Staff in Attendance: Four (Aaron Malchow, Ina Gard, Alice Marciel, Dianne McKay)
Number of Faculty/Staff Sharing Assessment Results: Same
SLOs Measured: 2. Articulate main ideas and make inferences in assigned readings
Assessment Tools: Attached

Assessment Results (What student needs and issues were revealed?)
- Found the need for repetition and explicit instruction for learning to take place.
- Found that absences affect student learning; excessive absences, and/or the occurrence of two holidays in November prior to giving the assessment, may have affected the outcomes.
- Found that students seem to do well on prescriptive work, but have difficulty creating structure and meaning for themselves.
- Instructors were interested to note that they came to the same conclusions regarding teaching and learning based on four different assessments, rubrics and groups of students. Results were remarkably consistent.

Next Steps to Improve Student Learning (How might student performance be improved?)
- Stress explicit instruction and repetition of major themes within the SLOs to improve student performance. The goal will be to make what the student needs to learn and demonstrate very clear, and to give students ample opportunity to practice and perfect that learning.

Next Steps in the Department to Improve Student Learning
- We will make the SLOs the major focus all semester. This serves to give structure to the teaching and learning.
- It is clear that this SLO needs to be rewritten into two separate SLOs, one for main ideas and one for inferences. The two cannot be measured together.

Priorities to Improve Student Learning (List the top 3-6 things faculty/staff felt would most improve student learning)
- Rewrite the SLO to separate main idea and supporting details from inferences. Both cannot be measured in one SLO.
- Make the SLOs explicit so students know the performance expectations, and provide ample practice.
- Recognize the need to repeat the main idea SLO in various ways (outlining, mapping and summarizing) throughout the semester.

Implementation (List the departmental plans to implement these priorities)
- By January 2008, rewrite the SLOs to separate main idea and supporting details from inference.
- In Spring 08, teachers will implement changes to teaching, and we will reassess in Fall 08.
- By Spring 08, create SLO overlays to our official course outlines to give to instructors to help focus their instruction around those SLOs.
- On Spring 08 Flex Days, share the results of the SLO validation with all department instructors and discuss how the SLOs can help shape instruction and the need for repeated practice.
- In Fall 08, we will select one assessment and rubric, give it to all READ 961 students, and evaluate that way.

Timeline for Implementation (Make a timeline for implementation of the top priorities): See above.


Individual Faculty Assessment Results Form: Aaron Malchow

You can generalize your results or use numbers instead of grades (A = 4 points, etc.). Remember that this assessment process can't be used to evaluate you personally or specific students. The point is to evaluate how students are mastering the core competencies.
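For instance, the note's suggested letter-to-number conversion (A = 4 points, etc.) could be applied to a form's grade counts as in the minimal sketch below. The sketch is for illustration only and is not part of the form itself; the sample counts echo the distribution reported on the form that follows.

```python
# Illustrative sketch: convert letter-grade counts to the 4-point scale the
# form suggests (A = 4 points, etc.) and compute a class mean.
GRADE_POINTS = {"A": 4, "B": 3, "C": 2, "D": 1, "F": 0}

def mean_grade_points(counts):
    """Mean score on a 0-4 scale for {letter: number_of_students} counts."""
    total_students = sum(counts.values())
    total_points = sum(GRADE_POINTS[grade] * n for grade, n in counts.items())
    return total_points / total_students

# Using the distribution on the form below (0 A's, 11 B's, 5 C's, 2 D's, 0 F's):
print(mean_grade_points({"A": 0, "B": 11, "C": 5, "D": 2, "F": 0}))  # 2.5
```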

Department: Reading Department - Aaron Malchow
Course: Reading 961
SLO: Articulate main ideas and make inferences in assigned readings.
Assessment Tool/Assignment (Attach): On a sheet of paper, for "Serving in Florida" in Nickel and Dimed, identify the main idea and one supporting detail for:
1. the first full paragraph on page 12,
2. the last paragraph on page 27, and
3. the last paragraph on page 30.
Rubric Used to Evaluate Assessment Tool (Attach): See attached page.
Number of A grades: 0
Number of B grades: 11
Number of C grades: 5
Number of D grades: 2
Number of F grades: 0
Any factors that may have affected the grades: Applying the rubric itself.
Thoughts on the assessment results (see page 1, 5c):

I looked at the student responses to the assignment twice: first eyeballing them, following my standard grading process, and then grading them using the rubric. Comparing my standard process to the rubric results, I would have marked at least 3-5 of the B-level responses as A-level work, and the 2 D-level responses would have been F-level by my typical standards.

I find it interesting that I would not have viewed any work as D-level using my typical grading process, but at this point in the semester, I would hope that my students could accomplish this task. The 2 student responses that rated a D-level on the rubric were incomplete and showed a lack of effort. The rubric does not have a rating that factors in incomplete work, as the bottommost ranking implies that some effort was given rather than that the assignment was left unfinished. Obviously, as a teacher, I should not feel obligated to follow the rubric's guidelines under those circumstances, but in using the rubric, I wanted to see what it accounted for, and did not account for, rather than make exceptions to it myself.

I also noticed that the rubric is actually looking for two skills, not one, in each category. In both categories, it implicitly identifies being able to complete the task "without having the article in front of him/her," meaning that the rubric is also attempting to measure recall as well as comprehension. For the assignment, I only wanted to measure comprehension of main ideas and supporting details.

While I am not inclined to use this rubric again, I do believe that using rubrics is useful in articulating expectations, both to myself as a teacher and to my students. I will use the experience to help me better determine which rubrics might be most effective for my classroom instruction.


Rubric Used to Evaluate Assessment Tool: Malchow

Identifies important information
4: Student lists all the main points of the article without having the article in front of him/her.
3: The student lists all the main points, but uses the article for reference.
2: The student lists all but one of the main points, using the article for reference. S/he does not highlight any unimportant points.
1: The student cannot list important information with accuracy.

Identifies details
4: Student recalls several details for each main point without referring to the article.
3: Student recalls several details for each main point, but needs to refer to the article occasionally.
2: Student is able to locate most of the details when looking at the article.
1: Student cannot locate details with accuracy.


Individual Faculty Assessment Results Form: Ina Gard

You can generalize your results or use numbers instead of grades (A = 4 points, etc.). Remember that this assessment process can't be used to evaluate you personally or specific students. The point is to evaluate how students are mastering the core competencies.

Department: Reading - Ina Gard
Course: Reading 961
SLO: 2. Articulate main ideas and make inferences in assigned readings
Assessment Tool/Assignment (Attach): Students read "Rowing the Bus." When complete, students mapped the reading and answered 10 main idea questions.
Rubric Used to Evaluate Assessment Tool (Attach): See attached.
Number of A grades: 2
Number of B grades: 3
Number of C grades: 4
Number of D grades: 2
Number of F grades: 4
Any factors that may have affected the grades: Holidays on Monday evening (when this class met) meant students were not in class for two weeks.
Thoughts on the assessment results (see page 1, 5c): Repetition seems even more important than I would have believed. Continually reviewing the main idea along with major points seems crucial. Having students map as much as possible appears to help. Level of success on this assessment tool seems to correlate with other work done in the class.


Individual Faculty Assessment Results Form: Alice Marciel

You can generalize your results or use numbers instead of grades (A = 4 points, etc.). Remember that this assessment process can't be used to evaluate you personally or specific students. The point is to evaluate how students are mastering the core competencies.

Department: Reading - Alice Marciel
Course: Reading 961
SLO: 2. Articulate main ideas and make inferences in assigned readings
Assessment Tool/Assignment (Attach): Students read the "Rowing the Bus" article, answered 10 main idea questions and mapped the story. 20 students were assessed.
Rubric Used to Evaluate Assessment Tool (Attach): See attached.
Number of A grades: 14 students (identified 8 responses)
Number of B grades: 2 students (identified 7 responses)
Number of C grades: 4 students (identified 4 or 5 responses)
Number of D grades: none
Number of F grades: none
Any factors that may have affected the grades: Increased mapping assignments for practice prior to administering the assessment.
Thoughts on the assessment results (see page 1, 5c):


Main Idea/Supporting Detail Rubric
Used by Alice Marciel and Ina Gard

Performance Level and Criteria
5 - Superior: Correctly identifies 8 of 10 responses. Demonstrates in-depth understanding of the material.
4 - Strong: Correctly identifies 6-7 of 10 responses. Demonstrates good understanding of the material.
3 - Adequate: Correctly identifies 4 or 5 of 10 responses. Demonstrates sufficient understanding of the material.
2 - Limited: Correctly identifies 3 of 10 responses. Demonstrates some understanding of the material.
1 - Very Limited: Correctly identifies fewer than 3 of 10 responses. Demonstrates lack of understanding of the material.
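Read as a score-to-level mapping, the rubric can be expressed as a small function. The sketch below is for illustration only; it assumes "8 of 10" means "at least 8," since the rubric does not say how scores of 9 or 10 correct are handled.

```python
# Illustrative sketch of the Marciel/Gard rubric's score-to-level mapping.
# Assumption: "8 of 10" is read as "at least 8"; the rubric itself does not
# state how 9 or 10 correct responses are scored.
def performance_level(correct, total=10):
    if not 0 <= correct <= total:
        raise ValueError("score out of range")
    if correct >= 8:
        return "5 - Superior"
    if correct >= 6:
        return "4 - Strong"
    if correct >= 4:
        return "3 - Adequate"
    if correct == 3:
        return "2 - Limited"
    return "1 - Very Limited"

print(performance_level(7))  # 4 - Strong
```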


Individual Faculty Assessment Results Form: Dianne McKay

You can generalize your results or use numbers instead of grades (A = 4 points, etc.). Remember that this assessment process can't be used to evaluate you personally or specific students. The point is to evaluate how students are mastering the core competencies.

Department: Reading - Dianne McKay
Course: Reading 961
SLO: 2. Articulate main ideas and make inferences in assigned readings
Assessment Tool/Assignment (Attach): Select one chapter from the week's reading in Breaking Through and, in one good sentence, write its central point (thesis); then list the details that support this central point. You may use mapping or outlining to do this if it is helpful.
Rubric Used to Evaluate Assessment Tool (Attach): See attached.
Number of A grades: 9
Number of B grades: 1
Number of C grades: 2
Number of D grades: 1
Number of F grades: 0
Any factors that may have affected the grades:
1. I changed this assignment to add the supporting details (map or outline) for this project. This by itself was an improvement to the assignment and the teaching that went with it.
2. The students had two previous opportunities to attempt the main ideas and supporting details and improve them. This was week 3 of a 4-week project. Students who received a C or D had had excessive absences and so missed the 2 weeks of learning experiences prior to this work.
Thoughts on the assessment results (see page 1, 5c): Based on this experience, I would allow more time in class specifically to peer review the main ideas and details. In the first 2 weeks, I had taken samples of student work, and as a group we critiqued and improved on them. I also took examples of good student work and used them as templates for the students.


Dianne McKay's Scoring Rubric for SLO Assessment of Main Idea

4 (A): Central point (main idea) is clear and stated in a complete sentence. All supporting details are relevant to the central point.
3 (B): Central point contains the main idea but may be unclearly stated and/or not in a complete sentence. Most supporting details support the central point.
2 (C): Student attempted to write a central point, but it is unclear, and some of the supporting details don't directly support the central point.
1 (D): The statement of the central point is incorrect given the subject matter, and the details don't support it.
0 (F): No statement of a central point is written and no appropriate supporting details are represented.

OUTCOMES for Reading 960 (three levels below College Level English): Upon completion, students will:
1. Apply vocabulary-building strategies to improve their analysis of readings.
2. Demonstrate a literal comprehension of readings, through identification and analysis of main ideas, supporting details and rhetorical patterns of organization and development.
3. Perceive themselves as growing in reading competence.

OBJECTIVES for Reading 960 (three levels below College Level English):
1. Demonstrate sufficient vocabulary and language development to allow for reading and written expression at a pre-collegiate level.
2. Recognize the main idea of a paragraph in pre-collegiate level readings.
3. Recognize supporting details in paragraphs in pre-collegiate level readings.
4. Understand organizational patterns and relationships of ideas in pre-collegiate level readings.
5. Apply word attack skills, including phonics, syllabication, and dictionary skills, to read and spell words at a pre-collegiate level.
6. Recognize and apply written context to inform vocabulary knowledge at a pre-collegiate level.


Appendix 4 Long Beach City College Materials for Determining Evaluation Techniques

Methods of Evaluation: The faculty author will describe general evaluation methodologies (1-2 sentence explanations) as they align with the course's assignments, course content, and student learning outcomes. Course assignments should not be reiterated on this page. The department's expectations, standards, or criteria of judgment must be included. Explain the criteria the instructor uses to evaluate students' work, the nature of the student performance expected as a means of demonstrating accomplishment of the learning outcomes, and how these evaluations demonstrate that students have met the expected outcomes for this course.

A significant number of the assignments described on the Assignment Page should be evaluated and, thus, identified and explained on this page.

There are three sections to this page:
1. a written evaluation section with several prompts,
2. a problem-solving evaluation section with several prompts, and
3. an "other evaluation" section with several prompts.
Please identify the required prompt carefully. Multiple choice and true-false tests should be explained under the "objective exam" prompt.

Typically a laboratory class would evaluate skills, techniques, and performance. This information should be described under the "skill demonstration" prompt.

The evaluation of higher-level critical thinking skills should be emphasized; see Bloom's Taxonomy or a comparable taxonomy.

Please use complete sentences when writing these responses.

A course grade may not be based solely on attendance.

Representative means of evaluation are illustrated below.

Written evaluation, such as:

Essay Exam(s): Students will write essay exam answers using anatomical terminology and reference terms of anatomical directions to compare and contrast the relationships of organs and organ systems.

Take home exams must contain accurate, clear, and coherent thesis statements supported by the appropriate number of paragraphs to sustain the student's argument.

Term or Other Paper(s):

The critique paper is evaluated on how well the student is able to justify his/her opinion of a dance concert through analysis of the choreography, performance, and theatrical elements by using detailed examples to support the thesis.

The "socialization report" is graded on the student's inclusion of the required materials, presentation methods, application to the reading, and depth of the self-reflective material.

A research term paper is graded based on quality of research, completeness and objectivity of data, reasoned conclusions that demonstrate critical thinking, clarity of organization, quality of written English, and timeliness.

Laboratory Report(s):

In written laboratory reports students must demonstrate the use of critical thinking skills to deduce the proper question, comply with the given instructions, maneuver in, through, and out of the website, synthesize key pieces of information, and compare this information to their own situation.


Written Homework:

Article worksheets are evaluated on the inclusion of the required information as well as the student's synthesis of the concepts presented.

Assignments will be evaluated on the completion of the assignment in a timely manner, thorough and correct completion of the assignment based on the instructions given, and signs of effort in the completion of the assignment.

Reading Report(s):

The topics for weekly reading reports will be derived from the textbook, monographs, and/or journal articles and will evaluate the analytical skills students need to develop written theses.

Computational or non-computational problem-solving demonstrations, such as:

Exam(s): Exams are evaluated on the student's ability to synthesize key concepts and solve appropriate problems as they relate to the content.

Quizzes: Several short quizzes are given during the semester and will evaluate the skills that the students have developed in utilizing the appropriate equations and diagrams to solve problems at the end of each chapter in the textbook.

Homework Problem(s):

Students will be given several assignments that present a problematic situation to be analyzed and resolved.

Laboratory Report(s):

Project/Lab Reports are evaluated based on completeness and the ability of the student to summarize results and draw conclusions.

Fieldwork: During an assigned field activity, the student will be evaluated on the critical thinking and analysis applied to the problem being addressed, the quality and effort displayed in the performance of the objective, and the new knowledge gained from the experience.

Further methods of evaluation, such as:

Skill demonstrations, such as: class performances(s), fieldwork, performance exam(s):

The student performance criteria are based on the accuracy of the movement shape and correct body alignment or technique, accuracy with the music, interpretation of the music, use of the appropriate energy dynamics, and the projection of confidence and stage presence without stopping.

Performance exams will be given to students periodically throughout the term to evaluate safety, technique, and procedures based on industry standards.

Objective examinations, such as: multiple choice, true/ false, matching items, completion:

Students will be given multiple-choice, true-false, matching and/or fill-in exam questions. Some questions will test detailed knowledge of the human body and others will require students to think at higher cognitive levels to assess the relationship of tissues, organs, and systems and to correlate these to functions in the body as a whole.

Objective exams will be used to evaluate a student's recall of key concepts and correct use of vocabulary/nomenclature.

Portfolio: The student's portfolio of finished projects will be evaluated on technical (basic use of software) as well as artistic (basic use of color and composition) merit.

Oral Presentation(s): Term end projects will be graded on the appropriate collection of data, accurate summary of results, and clarity of the presentation to the instructor and class. Students will be evaluated as to their ability to clearly discuss their work product in terms of aesthetics, composition, and technique.

Other (specify):


Appendix 5 Choosing the Right Assessment Tool

Assessment Tool: Pros and Cons

Multiple Choice Exam
Pros: easy to grade; objective
Cons: reduces assessment to multiple choice answers

Licensing Exams
Pros: easy to score and compare
Cons: no authentic testing; may become outdated

Standardized Cognitive Tests
Pros: comparable between students

Checklists
Pros: very useful for skills or performances; students know exactly what is missing
Cons: can minimize the large picture and interrelatedness; evaluation feedback is basically a yes/no (present/absent) without detail

Essay
Pros: displays analytical and synthetic thinking well
Cons: time-consuming to grade; can be subjective

Case Study
Pros: displays analytical and synthetic thinking well; connects other knowledge to the topic
Cons: creating the case is time-consuming; dependent on student knowledge from multiple areas

Problem Solving
Pros: displays analytical and synthetic thinking well; authentic if real-world situations are used
Cons: difficult to grade due to multiple methods and potentially multiple solutions

Oral Speech
Pros: easily graded with a rubric; allows other students to see and learn what each student learned; connects general education goals with discipline-specific courses
Cons: difficult for ESL students; stressful for students; takes course time; must fairly grade course content beyond delivery

Debate
Pros: provides immediate feedback to the student; reveals thinking and the ability to respond based on background knowledge and critical thinking ability
Cons: requires a good rubric; more than one evaluator is helpful; difficult for ESL students; stressful for students; takes course time

Product Creation & Special Reports
Pros: students can display skills, knowledge, and abilities in a way that is suited to them
Cons: must have clearly defined criteria and evaluative measures; "the look" cannot override the content

Flowchart or Diagram
Pros: displays original synthetic thinking on the part of the student; perhaps the best way to display overall high-level thinking and articulation abilities
Cons: more difficult to grade, requiring a checklist or rubric for a variety of different answers; difficult for some students to do on the spot

Portfolios
Pros: provides students with a clear record of their work and growth; best evidence of growth and change over time; students can display skills, knowledge, and abilities in a way that is suited to them; promotes self-assessment
Cons: time-consuming to grade; differing content in portfolios makes evaluating difficult and may require training; bulky to manage depending on size

Exit Surveys
Pros: provides good summative data; data are easy to manage if Likert-scaled responses are used
Cons: Likert scales limit feedback; open-ended responses are bulky to manage

Performance
Pros: provides the best display of skills and abilities; provides an excellent opportunity for peer review; students can display skills, knowledge, and abilities in a way that is suited to them
Cons: stressful for students; may take course time; some students may take the evaluation very hard, so evaluative statements must be carefully framed

Capstone Project or Course
Pros: best method to measure growth over time with regard to a course or program (cumulative)
Cons: focus and breadth of assessment are important; understanding all the variables that produce assessment results is also important; may result in additional course requirements; requires coordination and agreement on standards

Team Project
Pros: connects general education goals with discipline-specific courses
Cons: must fairly grade individuals as well as the team; grading is slightly more complicated; student interaction may be a challenge

Reflective Self-Assessment Essay
Pros: provides an invaluable ability to evaluate affective growth in students
Cons: must use evidence to support conclusions, not just self-opinionated assessment

Satisfaction and Perception Surveys
Pros: provides good indirect data; data can be compared longitudinally; can determine outcomes over a long period of time and across variables
Cons: respondents may be influenced by factors other than those being considered; watch validity and reliability


Appendix 6 Resources for Chapter 15

Cosumnes River College has an excellent short PowerPoint detailing the process of developing ESL SLOs at http://research.crc.losrios.edu/Marchand%20SLO%20Presentation.ppt#1

Fulks, J. (2004). Assessing Student Learning in Higher Education. http://online.bakersfieldcollege.edu/courseassessment/Default.htm

Hutchings, P. and Shulman, L.S. (2007). Perspectives: Learning about Student Learning from Community Colleges. The Carnegie Foundation for the Advancement of Teaching. Retrieved Feb 16, 2008 at http://www.carnegiefoundation.org/perspectives/sub.asp?key=245&subkey=1096

"Student Learning Outcomes and Instructional Planning: A Workbook." Cabrillo College, http://pro.cabrillo.edu/SLOs

Additional Helpful Resources include:

AAHE American Association for Higher Education. (1998). Nine Principles of Good Practice for Assessing Student Learning. American Association for Higher Education Assessment Forum. http://www.aahe.org/assessment/principl.htm

AAHE American Association for Higher Education. (1998). Assessment Forum. http://www.aahe.org/initiatives/assessment.htm

Academic Senate for California Community Colleges (ASCCC). (1993). Model District Policy for Prerequisites, Co-requisites, Advisories on Recommended Preparation and Other Limitations on Enrollment. From the ASCCC website http://www.academicsenate.cc.ca.us/Publications/Papers/Model_prerequisites.html

ACCJC-WASC. Accrediting Commission for Community and Junior Colleges, Western Association of Schools and Colleges. http://accjc.org

Allen, M.J. (2004). Assessing academic programs. Bolton, MA: Anker Publishing

Association of American Colleges and Universities (AAC&U). (2002). Greater Expectations: A New Vision for Learning as a Nation Goes to College. http://greaterexpectations.org

American Association of University Professors (AAUP). (1970). 1940 Statement of Principles on Academic Freedom and Tenure with 1970 Interpretive Comments. http://aaup.org/statements/Redbook/1940stat.htm

Angelo, T. A., & Cross, K. P. (1993). Classroom assessment techniques: A handbook for college teachers (2nd ed.). San Francisco, CA: Jossey-Bass.

Angelo, T. A. (1995a). Improving classroom assessment to improve learning: Guidelines from research and practice. Assessment Update, 7(6), 1-13.


Angelo, T.A. (1995b). Reassessing (and Defining) Assessment. Assessment Bulletin, 48(3), 7.

Angelo, T.A. (May, 1999). Doing Assessment As If Learning Matters Most. http://aahebulletin.com/public/archive/angelomay99.asp

Astin, A.W. (1993). Assessment for excellence: The philosophy and practice of assessment and evaluation in higher education. Phoenix, AZ: Oryx Press

Banta, T.W. (Ed). (1988). Implementing outcomes assessment: Promise and perils. New directions for institutional research, no.59. Vol. XV, No.3. San Francisco, CA: Jossey-Bass.

Banta, T.W., Lund, J.P., Black, K.E., & Oblander, F.W. (1996). Assessment in practice: Putting principles to work on college campuses. San Francisco, CA: Jossey-Bass Publishers.

Barr, R. B., & Tagg, J. (1995). From teaching to learning: A new paradigm for undergraduate education. Change, 27(6), 12-25

Benander, R., Denton, J., Page, D., & Skinner, C. (2000). Primary trait analysis: Anchoring assessment in the classroom. The journal of general education vol 9(4). University Park, PA; The Pennsylvania State University.

Bers, T. (n.d.) Assessment at the Program Level. California Assessment Website at http://cai.cc.ca.us/workshops/Prog Level Assessment by Bers.doc

Black, P. J., & Wiliam, D. (1998). Inside the black box: Raising standards through classroom assessment. Phi Delta Kappan, 80(2), 139-144.

Bond, L.A. (1996) Norm- and Criterion-Referenced Testing. At PARE online http://pareonline.net/getvn.asp?v=5&n=2

Boud, D. (1995a). Assessment for learning: contradictory or complementary? Retrieved January 4, 2004, from University of Technology Sydney Web site: http://www.education.uts.edu.au/ostaff/staff/publications/db_9_boud_seda_95.pdfy

Boud, D. (1995b). Developing a typology for learner self-assessment practices. Retrieved January 18, 2004, from the University of Technology Sydney Web site: http://www.education.uts.edu.au/ostaff/staff/publications/db_15_boud_brew.pdf

Boyer, C.M., & Ewell, P. T. (1988). State-based approaches to assessment in undergraduate education: A glossary and selected references. Denver, CO: Education Commission of the States.

Brookhart, S. M. (1999). The art and science of classroom assessment: The missing part of pedagogy. ASHE-ERIC Higher Education Report (Vol. 27, No.1). Washington, DC

California Assessment Institute http://cai.cc.ca.us/


California Master Plan for Education http://www.sen.ca.gov/masterplan/ and http://www.sen.ca.gov/masterplan/020909THEMASTERPLANLINKS.HTML

Center for Student Success at http://css.rpgroup.org/

Chappuis, S., & Stiggins, R.J. (2002). Classroom assessment for learning. Educational Leadership, 60(1), 40-43

Clark, D. (2004). Bloom’s Taxonomy (1956). http://www.coun.uvic.ca/learn/program/hndouts/bloom.html

Collins, L. (2002). The proposed accreditation standards: a proposed summary critique. Senate rostrum: Academic senate for California community colleges newsletter. Sacramento, CA: Academic Senate for California Community Colleges.

College Student Experiences Questionnaire http://www.iu.edu/~cseq

Comins, N.F. (2000). An in-your-face approach about student misconceptions in astronomy. Available at http://kramer.ume.maine.edu/~panda/comins/miscon.html

Community College Survey of Student Engagement http://www.ccsse.org/

Creel, D.W. (n.d.). Northern Virginia Community College General Education Assessment. http://www.nv.cc.va.us/assessment/VAG Gen Ed/VAG Gen Ed.PPT

Dressel, P. (1989). Grades: One more tilt at the windmill. In J. Eison (Ed.), The meaning of college grades (Essay on Teaching Excellence: Towards the Best in the Academy, Vol. 1, No. 6). Retrieved June 12, 2003, from The Professional and Organizational Development Network in Higher Education Web site: http://www.ulib.iupui.edu/teaching_excellence/vol1/v1n6.htm

Eder, D. J. (2003). Primary trait analysis. Retrieved June 4, 2003, from Southern Illinois University, Edwardsville SIUE, Undergraduate Assessment and Program Review Web site:http://www.siue.edu/~deder/assess/

Educational Testing Services at http://www.ets.org/

Erwin, T.D. (2000). The NPEC sourcebook on assessment, volume 1: Definitions and Assessment methods for critical thinking, problem-solving, and writing. Download document from http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2000195

Ewell, P.T. (1999). A delicate balance: The role of evaluation in management. Boulder, CO: National Center for Higher Education Management.

Fair Testing website. The Limits of Standardized Testing. http://www.fairtest.org/facts/Limits of Tests.html


Fair Testing Website. University Testing: Fact Sheets http://www.fairtest.org/univ/univfactsheets.htm

Fitzgerald, R.J. (1999). Total Quality Management in Education. Retrieved from http://www.minuteman.org/topics/tqm.html

Fleming, N. D., & Bonwell, C. C. (1998). VARK a guide to learning styles. Retrieved January 12, 2004, from the VARK website at http://www.vark-learn.com/english/index.asp

Flynn, W.J. (2004). Why not assess and document all learning? Learning Abstracts 7(3). Retrieved April 5, 2004 from the League for Innovation website at http://www.league.org/publication/abstracts/learning/lelabs0403.html

Harvard-Smithsonian Center for Astrophysics, Science Education Department, Science Media Group. 1987. A Private Universe (Videotape Documentary). Accessible at the Annenberg website at http://www.learner.org/resources/series28.html

Huba, M. E., & Freed, J. E. (2000). Learner-centered assessment on college campuses: Shifting the focus from teaching to learning. Needham Heights, MA: Allyn and Bacon.

Johnstone, D.B.(1993) The costs of higher education: Worldwide issues and trends for the 1990's. Retrieved May 1, 2004 at http://www.gse.buffalo.edu/FAS/Johnston/TRENDS.HTM

Johnson, J.H. (1997). Data-driven School Improvement. Retrieved from http://www.ericfacility.net/ericdigests/ed401595.html (ERIC Document number ED401595)

Kansas State University Assessment and Program Review (2002). How to write student learning outcomes. http://www.ksu.edu/apr/Learning/HowTo.htm

Kansas State University http://www.ksu.edu/apr/OrgsConfs/SLOT/index.htm

Lane College Outcomes Assessment http://www.lanecc.edu/vanguard/learningoutcomes.htm

Leskes, A. (2002). Beyond Confusion: An Assessment Glossary. At the AACU website http://www.aacu.org/peerreview/pr-sp02/pr-sp02reality.cfm

Levesque, Bradby, Rossi, MPR Associates (1996). Using Data for Program Improvement: How Do We Encourage Schools To Do It? http://ncrve.berkeley.edu/CenterFocus/CF12.html

Lowe, J. P. (1994). Assessment that promotes learning. Retrieved June 4, 2003, from Pennsylvania State University, Center for Teaching and Learning Excellence Web site: http://www.psu.edu/celt/Lowe.html

Middle States Commission on Higher Education [MSACHE]. (2003). Student learning assessment; Options and resources. Philadelphia, PA: Middle States Commission on Higher Education


Maki, P. (2002a, January). Developing an assessment plan to learn about student learning. Retrieved May 19, 2003, from the American Association for Higher Education, Assessment Web site: http://www.aahe.org/Assessment/assessmentplan.htm

Maki, P. (2002b, May). Moving from paperwork to pedagogy: Channeling intellectual curiosity into a commitment for assessment. Retrieved April 23, 2003, from the American Association for Higher Education, Bulletin Archives Web site: http://www.aahebulletin.com/public/archive/paperwork.asp

Maki, P. (2002c, January). Using multiple assessment methods to explore student learning and development inside and outside of the classroom. Retrieved May 2, 2003, from the National Association of Student Personnel Administrators, NetResults Web site: http://www.naspa.org/NetResults/article.cfm?ID=558

Maricopa Assessment Plan http://www.pc.maricopa.edu/Administration/IAP/index.html

Math League (2001). Elementary, help with data and statistics http://www.mathleague.com/help/data/data.htm

Mestre, J. (2000). Hispanic and Anglo Student Misconceptions in Math. Available at http://www.ericfacility.net/databases/ERIC_Digests/ed313192.html (ERIC document ED313192)

Miller, M. A. (1997). Looking for results: The second decade. In American Association for Higher Education (Ed.), Assessing impact: Evidence and action (pp. 23-30). Washington, DC: American Association for Higher Education.

Moskal, B.M. & Blake, B.B. (2000). Developing a departmental assessment plan: Issues and concerns. In The Department Chair 11(1). Bolton, MA: Anker Publishing. Also available online at http://www.acenet.edu/resources/chairs/docs/Moskal_and_Bath.pdf

National Center for Education Statistics http://nces.ed.gov/nationsreportcard/

National Center for Higher Education Management Systems (NCHEMS) http://www.nchems.org

National Commission on Excellence in Education. (1983). A nation at risk: The imperative for educational reform. Washington, DC: U.S. Government Printing Office. http://www.ed.gov/pubs/NatAtRisk/index.html

National Research Council [NRC]. (1996). National science education standards. Washington, DC: National Academy Press

National Research Council [NRC]. (2000). How people learn: Brain, mind, experience, and school. Washington, DC: National Academy Press.

National Research Council [NRC]. (2001a). Classroom assessment and the national science education standards. Washington, DC: National Academy Press.


National Research Council [NRC]. (2001b). Knowing what students know: The science and design of educational assessment. Washington, DC: National Academy Press.

National Survey of Student Engagement (NSSE) http://www.indiana.edu/~nsse


Nichols, J. O. (1995a). Assessment case studies: Common issues in implementation with various campus approaches to resolution. Flemington, NJ; Agathon Press.

Nichols, J.O. (1995b). A practitioner’s handbook for institutional effectiveness and student outcomes assessment implementation. Flemington, NJ; Agathon Press

Nichols, J.O. (2003). Report from the Project on Accreditation and Assessment. Retrieved May 25, 2004 at http://cai.cc.ca.us/Resources/LinkedDocuments/QualityOfTheDegreeByAACU.pdf

Nichols, J.O. & Nichols, K.W. (1995) The Departmental Guide and Record Book for Student Learning Outcomes Assessment and Institutional Effectiveness. Flemington, NJ; Agathon Press

Noel-Levitz http://www.noellevitz.com

O’Banion, T. (1997a). A learning college for the 21st century. Phoenix, AZ: Oryx Press and the American Council on Education.

O’Banion, T. (1997b). Creating more learning-centered community colleges [Monograph]. Mission Viejo, CA: League of Innovation.

O’Banion, T. (1999, March). The learning college: Both learner and learning centered. Learning Abstracts, 2, 2. Retrieved June 7, 2003, from http://www.league.org/learnab.html

Oregon State Student Affairs Assessment Audit http://oregonstate.edu/admin/student_affairs/research/Assessment Audit form.pdf

Pacheco, D. A. (1999). Culture of evidence. Retrieved June 1, 2003, from the California Assessment Institute, Resources Web site: http://www.ca-assessment-inst.org/Resources/Pacheco.htm

Palomba, C. A., & Banta, T. W. (1999). Assessment essentials: Planning, implementing, and improving assessment in higher education. San Francisco, CA: Jossey-Bass.

Poch, R.K. (1994). Academic Freedom in American Higher Education: Rights, Responsibilities and Limitations. ERIC Digest. ED366262 http://ericfacility.net/databases/ERIC_Digests/ed366262.html

Practical Assessment, Research, and Evaluation http://pareonline.net


Rodrigues, R. J. (2002). Want campus buy-in for your assessment efforts? Retrieved February 12, 2003, from the American Association for Higher Education, Bulletin Archives Web site: http://www.aahebulletin.com/member/articles/2002-10-feature02_1.asp

Rudman, H.C. (1999) Integrating Testing with Teaching at http://pareonline.net/getvn.asp?v=1&n=6

Rudner, L.M. (1994) Questions to Ask When Evaluating Tests. Eric Document number ED385607 at http://www.ericfacility.net/databases/ERIC_Digests/ed385607.html and PAREonline

Scantron at http://www.scantron.com

Schilling, K. M., & Schilling, K. L. (1998). Proclaiming and sustaining excellence: Assessment as a faculty role. ASHE-ERIC Higher Education Reports, 26(3), 1-105. Retrieved January 3, 2003 from Wilson Web database.

Schuyler, G. (1997). A paradigm shift from instruction to learning. (ERIC Document Reproduction Service No. ED414961). Retrieved June 4, 2003 from http://wwwericfacility.net/ericdigests/ed414961

Seymour, D. (1993). Quality on campus. Change, 25(3), 8-18.

Simpson, H. & Adams, J. (February 2002). "Ignore us at your peril!": The San Francisco accreditation hearings. Senate Rostrum: Academic Senate for California Community Colleges Newsletter. Sacramento, CA: ASCCC.

Sinclair Community College (2004). Assessment of student learning: Learning outcomes. http://www.sinclair.edu/about/assessment/outcomes/index.cfm

Soloman, B. A., & Felder, R. M. (1999). Index of learning styles. Retrieved November 15, 2003, from the North Carolina State University, Felder Resources in Science and Engineering Education Web site: http://www.engr.ncsu.edu/learningstyles/ilsweb.html

Stanford University Bridge Project (2003), Student Misconceptions about Preparing for an Attending College. Available at http://www.stanford.edu/group/bridgeproject/10+Misconceptions.pdf

Steen, L.A. (1999). Assessing Assessment. Retrieved November 15, 2003 at http://www.stolaf.edu/people/steen/Papers/assessment.html

Stiggins, R.J. (2002). Assessment crises: The absence of assessment for learning. Phi Delta Kappan, 83(10), 758-765.

Tanner, D.E. (2001). Assessing Academic Achievement. Needham Heights, MA: Allyn & Bacon.

The Student Learning Imperative: Implications for Student Affairs http://www.acpa.nche.edu/sli/sli.htm


Tickle, H.A. (1995). Assessment in general education. In J. O. Nichols (Ed.), A practitioner’s handbook for institutional effectiveness and student outcomes assessment implementation (3rd ed., pp. 172-185). Flemington, NJ; Agathon Press.

Trombley, W. (2003). The rising price of higher education. Retrieved May 1, 2004 at http://www.highereducation.org/reports/affordability_supplement/affordability_1.shtml

Udovic,D. (n.d.). Confronting Student Misconceptions in a Large Class. Available at http://www.wcer.wisc.edu/nise/CL1/CL/story/udovicda/TSDUA.htm

University of Washington. Student Learning Outcomes website at http://depts.washington.edu/grading/slo/SLO-Home.htm

University of Washington, Office of Educational Assessment at http://www.washington.edu/oea

Volkwein, J. F. (2003, May). Implementing outcomes assessment on your campus. eJournal of the Research and Practice Group of California Community Colleges.

Walvoord, B. E., & Anderson, V. (1995, November-December ). An assessment riddle: Guidelines from research and practice. In T. W. Banta (Ed.), Assessment Update, 7, 8-11.

Walvoord, B. E. & Anderson, V.J. (1998). Effective Grading: A Tool for Learning and Assessment. San Francisco, CA: Jossey-Bass.

Watson, R. J., & Klassen, P. T. (2003, February). Outcomes assessment: A faculty project. Advocate, 20(3), 1-5. Washington, DC: National Education Association Higher Education.

Western Australia University. A guide to writing student learning outcomes. Centre for Advancement of Teaching and Learning website at http://catl.osdo.uwa.ed.au/obe/outcomes

Wiggins, G. (1990). The Case for Authentic Testing. at http://pareonline.net/getvn.asp?v=2&n=2

Wiggins, G. P. (1993a). Assessing student performance: Exploring the limits of testing. San Francisco, CA: Jossey-Bass.

Wiggins, G.P. (1993b). Assessment: Authenticity, context, and validity. Phi Delta Kappan, 75, 200-208.

Wright, B. D. (1999). Evaluating learning in individual courses. Retrieved June 10, 2003 from the California Assessment Institute Website. http://www.ca-assessment-inst.org/Resources/Wright2.doc

Zull, J. E. (2003). The art of changing the brain: Enriching the practice of teaching by exploring the biology of learning. Sterling, VA: Stylus.