Grading Student Programs: A Software Testing Approach
14th CCSC Southeastern Conference, November 3-4, 2000
Edward L. Jones
CIS Department, Florida A&M University, Tallahassee, FL
[email protected]


Page 1:

Grading Student Programs: A Software Testing Approach

14th CCSC Southeastern Conference, November 3-4, 2000

Edward L. Jones, CIS Department

Florida A&M University, Tallahassee, FL

[email protected]

Page 2: Outline

• Motivation

• Tester’s Approach

• Assignment Life Cycle

• An Example

• Lessons Learned

• Future Work

Page 3: In 25 Words or Less

• Automation is possible

• Requires up-front teacher investment

• Requires behavior adjustment by students

• Problem is well-structured

• High potential for reuse

• Just Do It!!

Page 4: Motivation

• Enhance the learning experience

• Grading is labor-intensive

• Consistency is difficult to maintain

• Class size increases workload

• Explaining the same deductions over and over

Page 5: A Tester’s Approach

• Attitude -- finding what’s wrong!

• Consistent, repeatable error search process

• Careful documentation of results/findings

• Automation to reduce time investment

• Black-box testing based on the specification

Page 6: Programming Environment

[Diagram: the Teacher and the Student interact through two repositories, PR (public) and SR (submission). Teacher operations: postassignment, getcopy, postgrade. Student operations: getassignment, submit, viewgrade.]
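The repository operations above are, in spirit, simple file transfers. As a rough illustration only (the talk never shows these commands; the repository path and file-naming scheme below are invented), the student-side submit might be a tiny C shell script:

# submit -- hypothetical sketch of the student-side command (assumed,
# not shown in the talk). Copies the student's program into the
# submission repository (SR), tagged with the assignment name and the
# student's login so the teacher's getcopy can find it.
# Usage: submit assignment sourcefile
set SR = /courses/cobol/SR              # invented repository path
set who = `whoami`                      # student login
cp $2 $SR/{$1}_{$who}.cob
echo "Submitted $2 as $SR/{$1}_{$who}.cob"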

Page 7: An Example

• COBOL programming assignment

• Example of HOW TO do it

• Good sense of up-front effort

• Repeatable/reusable pattern

• Expose difficulties

• Insights into possibilities

Page 8: Program Grading Life Cycle

[Diagram: three phases. Prepare Assignment produces the Assignment Specification, from which the student writes the program (Write Program -> Student Program). Implement Grader turns the specification into a test plan, test cases, a test driver, and checker scripts -- together, the Automated Grader. Grade Programs runs the Automated Grader against each Student Program, producing a Grading Log and a Grade Report.]

Page 9: Lessons Learned

• Student shock & outrage at exactness

• Specification must be better than average!

• Front-end loaded: assure testability up front

• First-time automation is costly

• Amortized via similar assignment styles

• Students need the grader before submitting

• Students need to learn how to read, interpret and satisfy a specification

Page 10: Automation Challenges

• Does the teacher have the time?

• Is the automated grader TOO STRICT for CS1/CS2?

• What to ignore?

• Deduction schedules become complex – aggregate vs specific search strategy

• The “just a little more automation” trap!!

• The guts to do it!!

Page 11: Future??

• Do it again!

• Do it in other classes, CS1-CS3!

• Investigate table-driven checker scripts – reduce the costliest step (one possible shape is sketched after this list).

• Distribute grader along with assignment!

• Sell colleagues on the idea!
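To make the table-driven idea concrete, one possible shape for such a checker is sketched below, in the style of the checkscripts shown later in this deck. This is speculative: the talk proposes the idea but shows no implementation, and the rule-table format (pattern|message|penalty, one rule per line) and the script name checktable are invented here.

# checktable -- hypothetical table-driven checker (not from the talk).
# Each line of the rules table reads:  pattern|message|penalty
# A new assignment then needs only a new table, not a new script.
# Invocation: checktable rulestable student resultsfile
set pen = 0
set log = p7_{$2}.log
set nrules = `wc -l < $1`
set i = 1
while ($i <= $nrules)
   #-| Split rule $i into its three fields ($1..$3 here are awk's fields).
   set pat = "`awk -F'|' 'NR=='$i' {print $1}' $1`"
   set msg = "`awk -F'|' 'NR=='$i' {print $2}' $1`"
   set pts = `awk -F'|' 'NR=='$i' {print $3}' $1`
   grep "$pat" $3 >> $log
   if ($status) then              # pattern absent: charge the penalty
      @ pen = $pen + $pts
      echo "$2 - WRONG - $msg" >> $log
   endif
   @ i = $i + 1
end
echo "$2 - table PENALTY POINTS = $pen" >> $log
exit $pen

The driver would invoke it once per test run, just as it invokes checkscript$run today.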

Page 12: Questions?

Page 13: Thank You

Page 14: Interactive Input Requirements (1)

(1) The program must be menu-driven based on TRANSACTION CODE.

-----------------------------------------------
ENTER TRANSACTION CHOICE:
   1 -- Open Account
   2 -- Deposit
   3 -- Withdraw
   4 -- Close Account
   5 -- Check Balance
   9 -- QUIT PROGRAM
-----------------------------------------------

Page 15: Interactive Input Requirements (2)

(2) Sequence of interactive data entry, per transaction code:

Open     (01) : Account#  Amount  LastName  FirstName
Deposit  (02) : Account#  Amount
Withdraw (03) : Account#  Check#  Amount
Close    (04) : Account#  LastName  FirstName
BalCheck (05) : Account#

Page 16: Output Requirements - Messages

A. VALIDATION ERRORS

   - BAD TRANSACTION CODE
   - DEPOSIT AMOUNT TOO HIGH
   - DEPOSIT AMOUNT TOO LOW
   - MISSING ACCT OWNER NAME
   - MISSING INITIAL DEPOSIT

B. UPDATE ERRORS

   - ACCOUNT ALREADY EXISTS
   - ACCOUNT DOES NOT EXIST
   - ACCOUNT HAS NON-ZERO BALANCE
   - INSUFFICIENT FUNDS

C. UPDATE COMPLETION

   - DEPOSIT MADE
   - WITHDRAWAL MADE
   - BALANCE DISPLAYED
   - ACCOUNT OPENED
   - ACCOUNT CLOSED

Page 17: Processing Requirements

BUSINESS RULES (excerpts)

2.  An initial deposit AMOUNT must be given for OPEN.

3.  The initial deposit AMOUNT must be at least $20.00.

5.  The deposit AMOUNT must NOT exceed $10,000.00.

6.  A withdrawal AMOUNT may not exceed the balance.

7.  An account cannot be closed unless it has a ZERO balance.

10. An OPEN must FAIL if the account already exists.

Page 18: Test Plan (1)

1. There will be three test runs.

   1.1 Nominal -- valid transactions, successful updates.
       Goal: Demonstrate correct behavior in a "perfect" world.

   1.2 Update Errors -- valid transactions, update failures.
       Goal: Demonstrate ability to detect and correctly respond to master file conditions that prevent update.

   1.3 Transaction Errors -- invalid transactions.
       Goal: Demonstrate ability to detect and correctly respond to invalid transactions.

Page 19: Test Plan (2)

2. All test runs use the same master file.

3. Test Execution & Results Verification

3.1 For each test run there will be an input test data set

3.2 For each test run there will be a verification script that scans the program output file (p7_audit.rpt) for expected results.

3.3 Each verification script tallies deductions for failure to produce expected output.

3.4 The verification scripts write results to the log file p7_<user>.log (where <user> is the student login).

Page 20: Test Plan (3)

4. File Naming Conventions

Test run              1             2             3
Test Data Set         testdata1     testdata2     testdata3
Verification Script   checkscript1  checkscript2  checkscript3

Page 21: Test Driver

# -------------------------------------------------
# Test Driver: Perform 3 test runs.
# -------------------------------------------------

set slog=p7_{$1}.log
echo "p7 Grading Log for student $1" > $slog

set totpen=0
foreach run ( 1 2 3 )
   #-| Run program & check results.
   cobrun $1 < testdata$run
   csh checkscript$run $1 p7_audit.rpt
   set runpen = $status        # each checker returns its penalty as its exit status
   @ totpen = $totpen + $runpen
   echo "$1 - RUN $run PENALTY = $runpen"
end  # foreach run

echo "Student $1 - TOTAL PENALTIES = $totpen" >> $slog

exit
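Assuming the driver above is saved as, say, grade.csh (the talk does not give its file name), grading one student's submission looks like this:

csh grade.csh tmorris        # run all three test runs against tmorris's program
cat p7_tmorris.log           # review the resulting grading log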

Page 22: Test Cases

Run #1 NOMINAL Test Cases

VALID transaction, SUCCESSFUL update

Test Sequence / Expectation:

1) BAL CHECK -- ACCT (1111) / Bal = 250.55

2) DEPOSIT -- ACCT (1111) 249.45

3) BAL CHECK -- ACCT (1111) / Bal = 500.00

4) CLOSE -- ACCT (8888) Wine Brandy

5) WITHDRAW -- ACCT (9999) 26.00

6) BAL CHECK -- ACCT (9999) / Bal = 175.00

7) OPEN -- ACCT (5555) 876.54 Jones Ed

8) BAL CHECK -- ACCT (5555) / Bal = 876.54

-------------------------------
When Master File is:
-------------------------------

1111 000025055 010190 McNair Stub

2222 000560000 020293 Simmons Joe

3333 000000100 123199 Fisher Kelly

4444 000000750 012500 Broke Eyem

6666 000998900 071092 Beuche Bobby

8888 000000000 011595 Wine Brandy

9999 000020100 020100 Faulk Mark
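Tying the Page 15 entry sequences to this test sequence, one plausible rendering of testdata1 is sketched below as a C shell here-document. This is an assumption: the talk shows the test cases but not the keystroke file, the choice-then-fields line layout is guessed, and the withdrawal check number (101) is invented because the test case omits it.

# Build a hypothetical testdata1 for the nominal run (Run #1):
# menu choice on one line, then that transaction's data fields
# (in the Page 15 order) on the next. Layout and check# assumed.
cat > testdata1 << 'EOF'
5
1111
2
1111 249.45
5
1111
4
8888 Wine Brandy
3
9999 101 26.00
5
9999
1
5555 876.54 Jones Ed
5
5555
9
EOF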

Page 23: Checker Script (1)

# -------------------------------------------------------
# Filename:   checkscript1
# Purpose:    Determine deductions for wrong output.
#             RETURN # points deducted.
#
# Invocation: checkscript1 student resultsfile
# -------------------------------------------------------

set pen=0
set log=p7_{$1}.log

#-| AGGREGATE counts -- transaction completion messages.
#-| Search results written to student's grading log.
foreach msg ( OPENED CLOSED DISPLAY WITHDRAW DEPOSIT )
   grep $msg $2 >> $log
   if ($status) then
      @ pen = $pen + 1
      echo "$1 - WRONG - missing $msg MESSAGE" >> $log
   endif
end

Page 24: Checker Script (2)

#------------------------------------------------------------------------
#-| SPECIFIC Search: 249.45 DEPOSIT accepted.
egrep "[0]*2 " $2 | grep 249.45 >> $log
if ($status) then
   @ pen = $pen + 1
   echo "$1 - WRONG - missing DEPOSIT 249.45 MESSAGE" >> $log
endif

#-| SPECIFIC Search: 1111 DEPOSIT applied correctly.
egrep "1111" $2 | grep 500.00 >> $log
if ($status) then
   @ pen = $pen + 1
   echo "$1 - WRONG - BALANCE after DEPOSIT not 500.00" >> $log
endif

...

#-| Return the penalty points.
echo "$1 - script1 PENALTY POINTS = $pen" >> $log
exit $pen

Page 25: Program Grading Log (1)

p7 Grading Log for tmorris
____________________________
007 SUCCESS ACCOUNT OPENED
tmorris - WRONG - missing CLOSED MESSAGE
001 SUCCESS BALANCE DISPLAYED
003 SUCCESS BALANCE DISPLAYED
BALANCE DISPLAYED
005 SUCCESS WITHDRAWAL POSTED
006 SUCCESS BALANCE DISPLAYED
007 SUCCESS ACCOUNT OPENED
008 SUCCESS BALANCE DISPLAYED
001 VALID
002 VALID
003 VALID
004 INVALID MISSING AMOUNT
008 VALID
002 2 1111111 249.45
003 1111111 500.00 McNair Stub
007 1 5555555 30300 876.54 Jones Ed
7 5555555 876.54 000000 Jones Ed

tmorris - script1 PENALTY POINTS = 1

Page 26: Program Grading Log (2)

003 VALID 4 ACCOUNT CLOSED
tmorris - WRONG - extraneous CLOSED MESSAGE
tmorris - WRONG - missing FAIL MESSAGE
001 INVALID ACCT DOES NOT EXIST
002 INVALID ACCOUNT ALREADY EXI
tmorris - WRONG - CLOSE -- missing NON-ZERO BALANCE message
004 INVALID INSUFFICIENT FUNDS
005 INVALID ACCT DOES NOT EXIST
tmorris - script2 PENALTY POINTS = 3
001 SUCCESS ACCOUNT OPENED
003 SUCCESS ACCOUNT OPENED
tmorris - WRONG - extraneous OPENED MESSAGE
004 INVALID DEPOSIT AMOUNT TOO HIGH
005 INVALID DEPOSIT AMOUNT TOO LOW
tmorris - WRONG - extraneous DEPOSIT MESSAGE
004 INVALID DEPOSIT AMOUNT TOO HIGH
001 1 7777777 30300 5.00 OPENDeposit TooRLOW
001 7777777 5.00 000000 OPENDeposit TooRLOW
tmorris - WRONG - OPEN -- missing MISSING ACCT OWNER NAME message
tmorris - WRONG - OPEN -- missing MISSING ACCT OWNER NAME message
004 INVALID DEPOSIT AMOUNT TOO HIGH
005 INVALID DEPOSIT AMOUNT TOO LOW
tmorris - WRONG - OPEN -- missing MISSING INITIAL DEPOSIT message
tmorris - script3 PENALTY POINTS = 5

tmorris - TOTAL RUN PENALTY POINTS = 9

Page 27: Program Grading Life Cycle

[Diagram: the same life-cycle figure as Page 8 -- Prepare Assignment yields the Assignment Specification, from which the student writes the program; Implement Grader yields the test plan, test cases, test driver, and checker scripts that form the Automated Grader; Grade Programs produces the Grading Log and Grade Report.]

Page 28: SPRAE - A Tester’s Framework

• Specification: basis for testing

• Premeditation: no plan, no test

• Repeatability: deterministic outcome

• Accountability: full disclosure

• Economy: cost-effectiveness