Online Peer Evaluation System Team Green Apple
Team Members: Ada Tse, Amber Bahl, Tom Nichols, Matt Anderson
Faculty Mentor: Prof. M. Lutz
Project Sponsor: Richard Fasse, RIT Online Learning
Agenda
• Project Overview
• Current System
• Our Product Features
• Requirements Process
• Product Architecture and Design
• Software Process
• Risk Analysis and Mitigation
• Metrics
• Status & Future Goals
• Demo
Problem Statement
• The RIT Online Learning department needs an online peer evaluation system that can enhance the collaborative learning experience.
• Existing tool: paper
• Alternative: Clipboard Survey System
Importance
• Group work is an important aspect of today's education system
• The average SE graduate does about 16 group projects
Current System: Clipboard
• Create, Deploy, and Analyze
  – Provides different views for analysis, but is more effective for analyzing surveys than peer evaluations
  – Very hard to identify problem groups
• Not integrated with myCourses
• Survey system
• Can't deploy evaluations per group
• Hard to set up
• Reporting does not show group weaknesses
• No control over who takes the survey
Current System: Reporting View
View: Percentage/ Graph
Solution: Custom Application
Peer Evaluation System
• Integrated with existing system
  – Login pass-through
  – Course and group data imported directly from myCourses
• Setup workflow
  – Tailored for peer evaluations
• Question templates
  – Reusable
  – Shared between instructors
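The myCourses import above can be sketched roughly as follows. This is an illustrative Python sketch only: the real system was built on .NET, and the feed's element names (course, group, student) are assumptions, not the actual myCourses XML schema.

```python
# Hypothetical sketch of the myCourses course/group import; the feed
# layout below is an assumption made for illustration.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """
<course id="4002-780" name="SE Senior Project">
  <group name="Green Apple">
    <student username="at1234"/>
    <student username="ab5678"/>
  </group>
</course>
"""

def import_course(feed_xml):
    """Parse a course feed into {group_name: [usernames]}."""
    root = ET.fromstring(feed_xml)
    groups = {}
    for group in root.findall("group"):
        members = [s.get("username") for s in group.findall("student")]
        groups[group.get("name")] = members
    return groups

groups = import_course(SAMPLE_FEED)
print(groups)  # {'Green Apple': ['at1234', 'ab5678']}
```

Importing groups directly, rather than having instructors re-enter them, is what lets evaluations be deployed per group.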
Application Workflow
1. Create Evaluation (Instructor Main)
2. Take Evaluation (Student Main)
3. Analyze Results (Instructor Main: Reporting)
Instructor Main
• List of global and personal question templates
• Evaluation status
• Evaluations listed per course
Solution: Create Evaluation
• Select template
• Evaluation setup info
Solution: Create Templates
• Global / Personal
Solution: Student View
• Instructions
• All students of a group
Solution: Reporting
• Reporting is provided through multiple views
  – Multiple levels of detail
    • By group
    • By student
  – Sorted by groups or individuals
  – Quickly identify problem groups
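The "quickly identify problem groups" view can be sketched as a simple roll-up of responses by group. This is an illustrative Python sketch, not the team's .NET code; the tuple layout and the threshold of 3.0 are assumptions.

```python
# Illustrative sketch: roll peer-evaluation responses up by group and
# surface any group whose average rating falls below a threshold.
from collections import defaultdict

# (group, from_student, to_student, rating) tuples; sample data only
responses = [
    ("A", "s1", "s2", 5), ("A", "s2", "s1", 4),
    ("B", "s3", "s4", 2), ("B", "s4", "s3", 1),
]

def group_averages(responses):
    """Average rating per group."""
    ratings = defaultdict(list)
    for group, _frm, _to, rating in responses:
        ratings[group].append(rating)
    return {g: sum(r) / len(r) for g, r in ratings.items()}

def problem_groups(responses, threshold=3.0):
    """Groups below the threshold, sorted worst-first."""
    avgs = group_averages(responses)
    return sorted((g for g, a in avgs.items() if a < threshold),
                  key=lambda g: avgs[g])

print(group_averages(responses))  # {'A': 4.5, 'B': 1.5}
print(problem_groups(responses))  # ['B']
```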
Solution: Reporting View
Requirements Process
• Mainly elicited through:
  – In-person interviews
    • Project sponsors
    • Subject matter experts
    • Online Learning technical staff
  – UI mockups
  – Evaluating existing tools
    • RIT Clipboard
    • Peer evaluation templates
Requirements Analysis
• Use Case Analysis
• Workflow diagrams
  – Workflow steps
• Constant user feedback at the end of each Sprint
Product Architecture and Design
Entity Relationships

• Evaluation: PK id; start_date, end_date, group_type
• Group: PK/FK id; group_name
• HasStudents: PK/FK id
• Response: PK/FK id; ToStudentID, FromStudentID, QuestionID, Answer, GroupID, Comment
• Template: PK id; name, owner_id (FK), shared, global
• template_questions: PK id; question_text, answer_type, order, parent_id (FK)
• evaluation_questions: PK id; question_text, answer_type, order, parent_id (FK)
• Person: PK id; username, name, role
• course_section: PK id; name
• has_section: instructor_id (FK), course_section_id (FK)
• GroupType: PK/FK id; name, eval_id
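Two of the central entities above can be expressed as relational tables. The SQLite DDL below is for illustration only: column names follow the diagram, but the types and constraints are assumptions, since the slides do not specify them.

```python
# Hedged sketch of the Evaluation and Response entities as SQLite
# tables; types/constraints are assumed, column names are from the
# entity model.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Evaluation (
    id         INTEGER PRIMARY KEY,
    start_date TEXT,
    end_date   TEXT,
    group_type TEXT
);
CREATE TABLE Response (
    id            INTEGER PRIMARY KEY,
    ToStudentID   INTEGER,
    FromStudentID INTEGER,
    QuestionID    INTEGER,
    Answer        TEXT,
    GroupID       INTEGER,
    Comment       TEXT
);
""")
conn.execute(
    "INSERT INTO Response (ToStudentID, FromStudentID, QuestionID, Answer, GroupID) "
    "VALUES (1, 2, 10, '4', 7)"
)
row = conn.execute("SELECT Answer, GroupID FROM Response").fetchone()
print(row)  # ('4', 7)
```

Storing both ToStudentID and FromStudentID on each Response is what allows reporting to pivot either by the student being rated or by the responder.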
Data Model Architecture
Package Diagram
Deployment Diagram
Software Process
Process: Scrum
• Scrum is an iterative, incremental process for developing any product or managing any work. It produces a potentially shippable set of functionality at the end of every iteration (Sprint).
• 24-hour inspection cycle within each iteration
• 15-minute daily Scrum meeting:
  1. What did you do since the last Scrum meeting?
  2. Do you have any obstacles?
  3. What will you do before the next meeting?
• New functionality is demonstrated at the end of each sprint
• Product Backlog: prioritized product features desired by the customer
• Sprint Backlog: feature(s) assigned to the sprint; backlog items expanded by the team
Scrum: Sprint
• Typical team size 2 to 4 members
• Delivers working software
  – Typically in 1- to 4-week iterations
• Cross-functional tasks per team member
• New work may be uncovered by the team during development
Our Methodology
• A flavor of Scrum
• Differences:
  – Upfront requirements
  – Postponed the Sprint 1 delivery date by 2 weeks
• Similarities:
  – The whole project was implemented in chunks (Sprints) based on the requirements prioritization (Sprint Backlogs)
  – Team meetings
Risk Analysis and Mitigation
Risk
• New technologies
  – .NET
• Integration with myCourses
  – XML feeds
  – Testing
• LDAP authentication
• Complexity of business requirements
Risk Mitigation: Task Planning
• New technologies
  – Allocated tasks according to skill set
  – Team members started off with small/simple programs
  – Experienced team members educated the team
Risk Mitigation: Development
• LDAP & myCourses integration
  – Great help from the Online Learning technical staff
• Complex business requirements
  – Incremental development & comprehensive requirements gathering
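The LDAP login pass-through works by attempting a directory bind with the user's own credentials. The Python sketch below illustrates only the flow; the DN template and the `ldap_bind` stub are hypothetical stand-ins, and a real deployment would issue an actual LDAP simple bind (RFC 4511) against the campus directory instead.

```python
# Hedged sketch of LDAP pass-through authentication. DN_TEMPLATE and
# ldap_bind are illustrative assumptions, not the real RIT directory
# layout or an actual LDAP client call.

DN_TEMPLATE = "uid={username},ou=People,dc=rit,dc=edu"  # assumed layout

def ldap_bind(dn, password):
    """Stand-in for a real LDAP simple bind; replace with a client call."""
    directory = {"uid=at1234,ou=People,dc=rit,dc=edu": "secret"}  # demo data
    return directory.get(dn) == password

def authenticate(username, password):
    """Pass-through auth: success iff the directory accepts the bind."""
    return ldap_bind(DN_TEMPLATE.format(username=username), password)

print(authenticate("at1234", "secret"))  # True
print(authenticate("at1234", "wrong"))   # False
```

The design point is that the application never stores passwords; it simply forwards the bind result, which is what makes the integration a "pass-through".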
Risk Mitigation Plan: Software Process
• Use of Scrum
• User feedback (allows for midcourse corrections)
• Increased product visibility
• Increased progress visibility
• Sprint planning
  – Across the many sprints, the requirements were revised many times to ensure clarity.
  – Throughout every sprint, each decision was evaluated to make sure it aligned with the overall goals of the project.
Risk Mitigation: Tooling
• Subversion for revision control
• Google Groups
• Trac provides web-based management
  – View files and changesets
  – Automated synchronization of project documents to the web site
  – Integrated bug tracking system
Data Collection
Metrics
• Backlogs
  – Product
  – Sprint
• Number of tasks completed for a particular sprint (work effort distributed for each sprint)
• Number of bugs– By Feature– By Severity– Per Sprint
• Total effort (man hours) for all phases
Effort Metrics
Bugs Per Feature
Total # of bugs: 53
• Major: 22
• Minor: 11
• Trivial: 20
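As a quick consistency check, the severity counts above do sum to the reported total:

```python
# Sanity check on the bug metrics reported in the slides.
by_severity = {"Major": 22, "Minor": 11, "Trivial": 20}
total = sum(by_severity.values())
print(total)  # 53
```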
Current Status
• Progress
Key Features Progress
Requirements Elicitation DONE
Requirements Analysis (SRS) DONE
High Level Architecture DONE
Initial Setup (DB, Environment) DONE
Requirements Prioritization DONE
Sprint 1 (5th Week) DONE
Sprint 2 (7th Week) DONE
Sprint 3 (9th Week) DONE
Integration Testing DONE
Final Release 05/19/2006
Future Enhancements
• More views for reporting
  – Currently our application supports 2 views:
    • High-level: groups + students
    • Team member + responders + questions
• Better support for answer types
  – Currently our application supports:
    • Text
    • Radio button
Reflections
• Great team!
  – All team members were new to the group
• Appropriate software process model
• Delays in Sprint 1
  – Unknown technologies: .NET 2.0
Demo
• Peer Evaluation System
Questions
• Thank you!
Supporting Data
Effort by phase (hours):
• Requirements: 94
• Initial Setup: 16
• Analysis and Design: 32
• Sprint 1: 180
• Sprint 2: 89
• Sprint 3: 132
• Integration Testing: 30
• Midquarter/Final Presentation: 50
Challenges
• Uniformity
  – Rating system
  – Question system
• Faculty View
• Different User Types
• Synchronization with myCourses