TRANSCRIPT
1
Configurable, Incremental and Re-structurable Contributive Learning
Environments
Dr Kinshuk
Information Systems Department
Massey University, Private Bag 11-222, Palmerston North, New Zealand
Tel: +64 6 350 5799 Ext 2090
Fax: +64 6 350 5725
Email: [email protected]
URL: http://fims-www.massey.ac.nz/~kinshuk/
2
Reusability
• Benefits are widely known
• However, early promises of time and cost savings have not materialised
• In software reuse, only trivial pieces of code can be reused in another context without significant effort
3
CIRCLE Architecture
• The only way to increase usability, and in the process automatically increase reusability, is to allow the implementing teacher to contribute by:
• configuring the learning space
• incrementally adding to and re-structuring the scope and functionality of IES components
Early adoption: HyperITS
4
HyperITS
No pre-defined sequence of operations
• Concepts linked in an interrelationship network
• Inconsistency results in graded feedback that gradually guides the learner back to the starting point
• Misconceptions and missing conceptions are identified.
5
HyperITS
Emphasis on cognitive skills development
• Uses the cognitive apprenticeship approach to develop cognitive skills:
• Observation
• Imitation
• Dynamic feedback by learning
• Interpretation of data
• Static feedback from testing
6
HyperITS
Granular design
• Domain concepts are acquired in the context of their inter-related concepts
• Interfaces are brought up to give:
• Another perspective on the data set
• Fine grained interface to give details of a coarse grained presentation
• Fine grained basic application to revise steps at a more advanced level
7
HyperITS
Process modelling
• Overcoming the shortcomings of the overlay model
• Understanding the learner’s mental processes
• Allows finding optimal and sub-optimal paths in learning process
8
HyperITS Architecture
[Architecture diagram] Teacher A and Teacher B use the Hyper-ITS Builder, built on the CIRCLE architecture tools and general-purpose Hyper-ITS modules (Interface and Tutoring, general-purpose Marker), to supply static domain knowledge, a pedagogy base and an optional problem bank; the result is a Web-based Intelligent Tutoring Application (ITA) delivered to the learner.
9
Knowledge representation
[Concept network diagram] Variables R, P, Q, VT, VU, CT and CU are linked by the relationships:
R = VT + CT     R = Q * P
CT = R - VT     CT = Q * CU
Q = VT / VU     Q = CT / CU     Q = R / P
CU = CT / Q     CU = P - VU
VU = VT / Q     VU = P - CU
VT = R - CT     VT = Q * VU
P = R / Q       P = VU + CU
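A minimal sketch of how this concept network might be encoded and forward-chained, assuming relationships are stored as (derived concept, required inputs, formula) triples. The names `RELATIONSHIPS` and `derive_all` are illustrative, not from HyperITS.

```python
# Illustrative subset of the slide's relationships; each entry says
# which concept it derives, which inputs it needs, and how.
RELATIONSHIPS = [
    ("R",  ("Q", "P"),   lambda q, p: q * p),
    ("R",  ("VT", "CT"), lambda vt, ct: vt + ct),
    ("VT", ("Q", "VU"),  lambda q, vu: q * vu),
    ("CT", ("Q", "CU"),  lambda q, cu: q * cu),
    ("P",  ("VU", "CU"), lambda vu, cu: vu + cu),
    ("Q",  ("R", "P"),   lambda r, p: r / p),
]

def derive_all(known):
    """Forward-chain over the network until no new concept can be derived."""
    known = dict(known)
    changed = True
    while changed:
        changed = False
        for target, inputs, formula in RELATIONSHIPS:
            if target not in known and all(i in known for i in inputs):
                known[target] = formula(*(known[i] for i in inputs))
                changed = True
    return known

# From Q, VU and CU every other concept in the subset follows.
solution = derive_all({"Q": 10, "VU": 4, "CU": 6})
```

Because several relationships derive the same concept, the network supports multiple routes to the same solution, which is what lets the system follow whichever route the learner takes.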
10
Knowledge representation framework
[Architecture diagram] The student works through the user interface and tutoring module, connected over the Internet (peers connect likewise) to the HyperITS server and its core interface and tutoring modules:
• Tutoring functionality (server): dependency calculator, navigation controller
• Tutoring functionality (client): discrepancy evaluator, contextual dependency finder, dependency activator, local optimiser, dynamic feedback generator
• Domain layer (from educational designers): domain concepts, relationships and other conceptual properties
• Contextual layer: expert solution, expert problem-solving approach, immediate goals
• Contextual functionality: domain representation initialiser, random problem generator, prediction boundary initialiser, prediction boundary updater
• Teacher model layer (from implementing teachers): pedagogy base, optional problem bank
• Student model, fed by assessment data
11
Domain Layer
Static domain content provided by the designing teacher:
• Concepts, the smallest learning units
• Relationships among concepts
• Priorities associated with the relationships
• Custom operator definitions
• Constraints on backward chaining, if desired
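One way the static domain content listed above might be encoded; the class and field names here are assumptions for illustration, not HyperITS's actual format.

```python
from dataclasses import dataclass, field

@dataclass
class Relationship:
    derives: str           # the concept this relationship produces
    requires: tuple        # concepts needed as inputs
    operator: str          # custom operator definition, e.g. "Q * P"
    priority: int = 1      # teacher-assigned priority among alternatives
    backward: bool = True  # whether backward chaining through it is allowed

@dataclass
class DomainLayer:
    concepts: set = field(default_factory=set)  # the smallest learning units
    relationships: list = field(default_factory=list)

    def add(self, rel: Relationship) -> None:
        # Registering a relationship also registers the concepts it touches.
        self.concepts.update((rel.derives, *rel.requires))
        self.relationships.append(rel)

domain = DomainLayer()
domain.add(Relationship("R", ("Q", "P"), "Q * P", priority=1))
domain.add(Relationship("R", ("VT", "CT"), "VT + CT", priority=2))
```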
12
Teacher Model Layer
• Consists of the pedagogy base reflecting various tutoring strategies and scaffolding provided by the implementing teacher
• Optional problem bank created by the implementing teacher to situate the concepts in a particular context
• The teacher can also provide additional diverse contexts
13
Contextual Layer
Contains the current goals and structural information of current tasks:
• system’s solution to current problem;
• system’s problem solving approach;
• immediate goals.
This information is dynamically updated as the learner progresses through problem solving.
14
Initialization functionality
Domain representation initialiser
initialises the system according to the current learning goal for all types of problems.
Random problem generator
randomly selects concepts to treat as independents and creates their instances by randomly generating values within specified boundaries.
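A hypothetical sketch of such a generator: the chosen independent concepts are instantiated within teacher-specified boundaries. The `BOUNDS` table and function name are illustrative assumptions.

```python
import random

# Teacher-specified value boundaries per concept (illustrative).
BOUNDS = {"Q": (1, 100), "VU": (1.0, 50.0), "CU": (1.0, 50.0)}

def generate_problem(independents, bounds, seed=None):
    """Instantiate each independent concept with a random in-bounds value."""
    rng = random.Random(seed)
    problem = {}
    for concept in independents:
        lo, hi = bounds[concept]
        if isinstance(lo, int) and isinstance(hi, int):
            problem[concept] = rng.randint(lo, hi)          # integer concept
        else:
            problem[concept] = round(rng.uniform(lo, hi), 2)  # real concept
    return problem

problem = generate_problem(["Q", "VU"], BOUNDS, seed=42)
```

Seeding the generator makes a particular problem reproducible, which is useful when the same instance must be shown to the learner again.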
15
Initialization functionality
Prediction boundary initialiser
initialises the boundaries for the overlay model (how far the student’s solution may deviate from the expert solution).
These boundaries are used later to evaluate a learner’s action.
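One possible boundary scheme, sketched here as a symmetric relative tolerance band around each expert value. The 5% default and positive-value assumption are illustrative choices, not HyperITS's actual parameters.

```python
def init_boundaries(expert_solution, rel_tolerance=0.05):
    """Return a (low, high) band around each expert value."""
    return {
        concept: (value * (1 - rel_tolerance), value * (1 + rel_tolerance))
        for concept, value in expert_solution.items()
    }

bounds = init_boundaries({"R": 100, "P": 10})
```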
16
If an independent variable is introduced
Contextual dependency finder
identifies the dependent concepts that can be derived within the current state of the problem space.
Dependency activator (client side)
activates the instances of the contextually dependent concepts and invokes the dependency calculator at server to update their current status in the expert solution.
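A sketch of the dependency-finding step: given the concepts whose values are currently known, list the dependents that have at least one relationship whose inputs are all known. The `RELATIONSHIPS` table and function name are illustrative.

```python
# Each dependent concept maps to the alternative input sets that derive it
# (a subset of the slide 9 network, for illustration).
RELATIONSHIPS = {
    "VT": [("Q", "VU"), ("R", "CT")],
    "CT": [("Q", "CU"), ("R", "VT")],
    "P":  [("VU", "CU"), ("R", "Q")],
    "R":  [("Q", "P"), ("VT", "CT")],
}

def find_derivable(known, relationships=RELATIONSHIPS):
    """Dependents derivable in the current state of the problem space."""
    return sorted(
        target
        for target, alternatives in relationships.items()
        if target not in known
        and any(all(i in known for i in inputs) for inputs in alternatives)
    )

derivable = find_derivable({"Q", "VU", "CU"})
```

Here R is not yet derivable: both of its routes need an intermediate (P, or VT and CT) to be derived first.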
17
If an independent variable is introduced
Dependency calculator (server side)
provides values for the dependent concepts based on domain layer and pedagogy base to update the expert solution.
This functionality allows a learner to adopt a different route to the solution than the one currently adopted by the system.
18
Setting validation bounds for dependent variables
Prediction boundary updater
updates the prediction boundaries used in comparing a learner’s solution with the expert solution. The updater fine-tunes the system’s initial prediction boundaries to match the route to solution adopted by a learner.
19
Validation of learner’s input to dependent variables
Discrepancy evaluator
evaluates the validity of a learner’s attempt by matching it with the expert solution within the prediction boundaries.
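A minimal sketch of this check: an attempt is valid when it falls inside the prediction boundaries around the expert value. The function name and return labels are assumptions for illustration.

```python
def evaluate_attempt(concept, learner_value, boundaries):
    """Classify a learner's value against the prediction boundaries."""
    lo, hi = boundaries[concept]
    if lo <= learner_value <= hi:
        return "valid"
    return "discrepant"

# Boundaries as produced by the prediction boundary initialiser/updater.
boundaries = {"R": (95.0, 105.0)}
```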
20
Validation of learner’s input to dependent variables
Dynamic feedback generator
provides context-based feedback to the learner. The messages are generated dynamically to improve semantics and to prevent monotony.
Granular approach is used in identifying the source of error and for providing feedback.
21
Validation of learner’s input to dependent variables
Dynamic feedback generator
i. Basic misconceptions, where the learner fails to derive a variable due to misconceptions about the critical concepts. In such cases, graded scaffolding is used:
a. ask the learner to try again;
b. suggest the relationship to be used;
c. provide the calculation data;
d. show the full calculation, and allow the learner to proceed.
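The graded scaffolding above can be sketched as a ladder of feedback levels, one step down per failed attempt. The message templates here are illustrative, not HyperITS's actual feedback text.

```python
# One template per scaffolding step (a through d above).
SCAFFOLD = [
    lambda rel, data: "Not quite - try deriving this variable again.",
    lambda rel, data: f"Hint: use the relationship {rel}.",
    lambda rel, data: f"Apply {rel} to the values {data}.",
    lambda rel, data: f"Worked step: {rel} with {data}. You may now proceed.",
]

def graded_feedback(attempt_number, relationship, data):
    """Return the feedback for the given (1-based) failed attempt."""
    step = min(attempt_number, len(SCAFFOLD)) - 1
    return SCAFFOLD[step](relationship, data)

message = graded_feedback(2, "R = Q * P", {"Q": 10, "P": 10})
```

Generating messages from templates rather than fixed strings is one way to vary wording and avoid the monotony mentioned above.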
22
Validation of learner’s input to dependent variables
Dynamic feedback generator
ii. Missing conceptions, where the learner unsuccessfully tries to derive a variable that requires derivation of intermediate variables, the error arising from missing knowledge about intermediate relationships.
The system suggests that the learner derive the intermediate concept first.
23
Validation of learner’s input to dependent variables
Dynamic feedback generator
iii. If the learner unsuccessfully tries to derive some complex concepts, the system advises the learner to use a finer-grained interface.
The finer-grained interface deconstructs the complex concept into its components to capture misconceptions at a fine-grained level.
24
Evaluating learner’s process of deriving solution
Local optimiser
identifies the possible relationships and determines the best relationship to use based on the priorities specified in the domain layer.
It allows the system to identify any sub-optimal approach adopted by the learner.
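A sketch of this selection: among the relationships that can derive the target from the currently known concepts, pick the one with the best teacher-assigned priority. The convention that a lower number means higher priority, and all names here, are assumptions.

```python
# (target, inputs, priority) triples from the domain layer (illustrative).
RELATIONSHIPS = [
    ("R", ("Q", "P"), 1),
    ("R", ("VT", "CT"), 2),
]

def best_relationship(target, known, relationships=RELATIONSHIPS):
    """Highest-priority applicable relationship for `target`, or None."""
    candidates = [
        (priority, inputs)
        for t, inputs, priority in relationships
        if t == target and all(i in known for i in inputs)
    ]
    return min(candidates)[1] if candidates else None

def is_suboptimal(target, known, chosen_inputs):
    """Flag the learner's route when a higher-priority one was available."""
    best = best_relationship(target, known)
    return best is not None and tuple(chosen_inputs) != best
```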