Assessing Internet Resources in Statistics Education


Assessing Internet Resources in Statistics Education

Ginger Holmes Rowell, Middle Tennessee State University

September 11, 2007

Internet

• A Great Source of Statistics Education Learning Resources – interactive applets, videos, tutorials, lesson plans, case studies, engaging learning activities, …

Issues with Internet-based Instructional Technologies

• The amount of information on the Internet can be overwhelming. Now what was that URL?
• Limited time for:
  – Finding & evaluating
  – Selecting & assessing

Assessing Technology

As teachers, we are accustomed to assessing student learning. Is that different from assessing technology-based learning resources?

Overview

• A General Approach for Assessing Instructional Technologies
  – Triadic Assessment
• Examples of Existing Assessments
  – Digital Libraries
  – MERLOT/CAUSEweb Peer Review Criteria
• Other Resources/Examples

Internet-Based Instructional Technologies: Large Statistics Projects

• Rice Virtual Lab in Statistics
  http://onlinestatbook.com/rvls.html
• NCTM Illuminations
  http://illuminations.nctm.org
• Virtual Laboratories in Probability and Statistics
  http://www.math.uah.edu/stat/
• DIG Stats
  http://www.cvgs.k12.va.us/DIGSTATS/

General Assessment: Instructional Technologies

• The TLT Group (Teaching, Learning, and Technology)
  http://www.tltgroup.org/
• Flashlight Program: For the Study and Improvement of Educational Uses of Technology
  http://www.tltgroup.org/flashlightP.htm

Technology Assessment

• Monad Approach
• Dyadic Approach
• Triadic Approach

[Triad diagram: Technology – Activity – Outcome]

Flashlight Project, The TLT Group, Gilbert

Assessing Technology: An Example

• Desired Outcome: Recognize the convergence of the empirical distribution to the theoretical function; visualize the shape of the binomial distribution for different values of n and p.
• Select Activity: Simulate a binomial experiment 10, 100, and 1000 times; vary n and p.
• Select Technology: 10 simulations – low tech; 1000 simulations – high tech (computer simulation).

Binomial Distribution Simulation (n = 10, p = 0.5)

[Applet screenshots for 10, 100, and 1000 repetitions. Note the apparent convergence of the relative frequency function (red) to the probability mass function (blue).]

http://www.math.uah.edu/stat/bernoulli/Binomial.xhtml
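
The convergence illustrated by the applet can also be checked outside the browser. Below is a minimal Python sketch (NumPy and SciPy are assumptions here, not tools mentioned in the presentation) that simulates the binomial experiment with n = 10 and p = 0.5 and reports how far the empirical relative frequencies sit from the theoretical probability mass function as the number of repetitions grows.

```python
# Hypothetical illustration of the slide's activity; not part of the original talk.
# Simulate Binomial(n=10, p=0.5) 10, 100, and 1000 times and compare the
# empirical relative frequency function to the theoretical pmf.
import numpy as np
from scipy.stats import binom

n, p = 10, 0.5
rng = np.random.default_rng(seed=1)

for repetitions in (10, 100, 1000):
    successes = rng.binomial(n, p, size=repetitions)   # one success count per repetition
    empirical = np.bincount(successes, minlength=n + 1) / repetitions
    theoretical = binom.pmf(np.arange(n + 1), n, p)
    max_gap = np.abs(empirical - theoretical).max()
    print(f"{repetitions:>4} repetitions: max |relative frequency - pmf| = {max_gap:.3f}")
```

The reported gap typically shrinks as the number of repetitions increases, which is exactly the convergence the slide asks students to recognize.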

Binomial, n = 10

[Distribution plots for p = 0.15, p = 0.5, and p = 0.85. Note the change in the shape of the distribution (blue) as p changes.]

http://www.math.uah.edu/stat/bernoulli/Binomial.xhtml
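
A companion sketch in the same spirit (again Python with SciPy, an assumption rather than anything used in the talk) tabulates the Binomial(10, p) probability mass function for the three values of p shown above, so the change in shape is visible numerically as well as graphically: right-skewed at p = 0.15, roughly symmetric at p = 0.5, and left-skewed at p = 0.85.

```python
# Hypothetical illustration; shows how the Binomial(10, p) pmf shifts with p.
import numpy as np
from scipy.stats import binom

n = 10
k = np.arange(n + 1)
for p in (0.15, 0.5, 0.85):
    pmf = binom.pmf(k, n, p)
    mode = int(k[np.argmax(pmf)])          # most likely number of successes
    print(f"p = {p:.2f}: mode at k = {mode}, pmf = {np.round(pmf, 3)}")
```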

One Solution for Internet-based Instructional Technologies

• Digital Libraries
  – Organized, searchable
  – Reviewed & assessed (sometimes)
• National Science Digital Library (www.nsdl.org)

Digital Library Projects

• http://www.merlot.org/
• http://www.mathdl.org/
• http://www.smete.org/ (Science, Math, Engineering & Technology Education)
• http://www.causeweb.org

MERLOT & CAUSEweb

• MERLOT (http://www.merlot.org/)
  – Not just for statistics
  – An established assessment process
• CAUSEweb (http://www.causeweb.org)
  – Mission: support undergraduate statistics teachers & learners

www.CAUSEweb.org

• Digital Library for Undergraduate Statistics Education

• CAUSEway workshops

• USCOTS

• Webinars

• Future Opportunities (keep checking this website)

• Getting Started

• Guidelines for Research

• Readings & Publications

• Literature Index

• Dissertations

• Working Groups and Projects

www.merlot.org

MERLOT Statistics

• MERLOT Statistics and CAUSEweb share an Editorial Board – Roger Woodard, Editor (editor@causeweb.org)
• Peer-reviewed materials appear in both locations, and a composite review is shown
• Peer reviews help to:
  – Pass on expert knowledge
  – Point out possible downsides

MERLOT/CAUSEweb Review Components

• Descriptive
  – All items in MERLOT and CAUSEweb have a descriptive component
• Evaluation
  – Peer-reviewed items have an evaluation component
  – Evaluations are shown on the record
  – MERLOT has a “star” rating system

Review Component: Description

• Overview

• Type of Material

• Technical Requirements

• Learning Goals*

• Recommended Uses*

• Target Student Population

• Prerequisites*

* Only required on peer-reviewed items

Review Component: Evaluation

• Content Quality

• Potential Effectiveness as a Teaching Tool

• Ease of Use

• Issues and Comments
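
To make the two review components concrete, here is a hypothetical sketch of how a single review record could be represented in code. The field names follow the slides above, but the structure itself is only illustrative and is not part of MERLOT's or CAUSEweb's actual systems.

```python
# Hypothetical data model for a MERLOT/CAUSEweb review record (illustrative only).
from dataclasses import dataclass, field
from typing import List

@dataclass
class Description:
    # Descriptive component: every item in MERLOT and CAUSEweb has one.
    overview: str
    type_of_material: str
    technical_requirements: str
    target_student_population: str
    # Starred fields: required only on peer-reviewed items, per the slide.
    learning_goals: str = ""
    recommended_uses: str = ""
    prerequisites: str = ""

@dataclass
class Evaluation:
    # Evaluation component: present only on peer-reviewed items.
    content_quality: str
    potential_effectiveness: str
    ease_of_use: str
    issues_and_comments: List[str] = field(default_factory=list)
```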

CAUSEweb/MERLOT Review Criteria Overview

• Quality of Content (concepts, models, and skills): valid & educationally significant
• Likely Effectiveness: improves ability to learn the material, easily integrated, learning goals easily identifiable, conducive to writing good learning assignments, promotes/uses effective learning strategies
• Ease of Use: easy first-time use, consistent appearance, clear instructions, not limited by technical resources

Content Quality

• Does the item present valid concepts, models, and skills?
  – Is the information presented by the item factually correct?
  – Does the item use appropriate vocabulary?
  – Does the information presented by the item follow generally accepted notation?
  – Does the item encourage appropriate statistical practice?
  – Does the item integrate graphics and multimedia when appropriate?

Content Quality

• Does the item present educationally significant concepts, models, and skills?
  – Does the item help develop conceptual understanding of statistics?
  – Is the item non-trivial? Is the level of understanding obtained worth the amount of time required?
• Does the item deal with an important topic?
  – Does it deal with a topic that students typically find challenging?

Potential Effectiveness as a Teaching Tool

• Are the teaching-learning goals easy to identify?
• Can the item be readily integrated into a statistics course or curriculum?
  – Does the item fit into the standard presentation of statistics courses?
  – Can the item be used with standard texts?

Potential Effectiveness as a Teaching Tool

• Does the item promote and/or use effective learning strategies?
  – Does the item promote active engagement?
  – Does the item help develop critical thinking skills?
  – Will the item promote student discovery?
• Can learning be readily assessed?

Ease of Use

• How easy is the item to use for the first time?
• Does the item have a consistent feel and appearance?
• Does the item have clear instructions?
• Is the item convenient and inviting to use?
• Does the item provide an effective feedback mechanism?
• Can the item be used by individuals with a variety of backgrounds/technical skills?
• Is technical support necessary?
• Is the item limited by technical resources such as Internet connection speed or special plug-ins?

Interested in helping Peer Review for CAUSEweb/MERLOT?

Contact: editor@causeweb.org

Other Technology Assessment Resources & Examples

• Network for the Evaluation of Education and Training Technologies (EvNet)
  – Comparison terms
  – Example: Robert delMas paper/software
• ARTIST
  – Great resource for statistics assessment
  – Online Introductory Statistics Test Builder

General Assessment: Instructional Technologies

Network for the Evaluation of Education and Training Technologies (EvNet)

• National multi-discipline, multi-sector network
• Committed to improving instructional technologies (used in Canadian education) through research focused on assessment and evaluation

http://socserv2.socsci.mcmaster.ca/srnet/exsum.htm

Technology Assessment: EvNet Best/Worst Practices

• Best Practice Labels
  – Warmware
  – Inclusive
  – Access equity
  – User control
  – Learner driven
  – Knowledge building
  – Meaningful interaction
  – Multidimensional
  – …
• Worst Practice Labels
  – Coldware
  – Blocks access
  – Expensive/costly
  – Loss of control
  – Destructive of collaborative learning
  – No interactivity
  – …

EvNet, pp. 2-3, http://socserv2.socsci.mcmaster.ca/srnet/exsum.htm

Best Practices Example

• Robert delMas: develop & assess software
• Software: Sampling Distribution
• Objective: students explore simulations of sampling distributions to learn and/or discover Central Limit Theorem concepts
• Assessment: combines existing research models (Nickerson, and Holland et al.'s "Inductive Reasoning Model") to list features that promote understanding (1996 IASE Roundtable Conference on the Role of Technology)

ARTIST: Assessment Resource Tools for Improving Statistical Thinking

https://ore.gen.umn.edu/artist/index.html

ARTIST Assessment Builder (for Introductory Statistics)

Free login

References

• Cuneo, Carl. "Twenty Criteria for Evaluating Instructional Technologies: From Best to Worst Practices." EvNet website. http://socserv2.mcmaster.ca/srnet/tools/tktoc.htm. Retrieved July 26, 2004.
• delMas, Robert. "A Framework for the Evaluation of Software for Teaching Statistical Concepts." 1996 IASE Roundtable Conference Proceedings, website version. http://www.stat.Auckland.ac.nz/~iase/php?show=8, PDF file 7.delMas.pdf. Retrieved July 15, 2004.
• EvNet, Network for the Evaluation of Education and Training Technologies. http://socserv2.socsci.mcmaster.ca/srnet/exsum.htm. Retrieved July 23, 2004.

References (continued)

• Garfield, Joan. "Preface" to 1996 IASE Roundtable Conference Proceedings, website version. http://www.stat.Auckland.ac.nz/~iase/php?show=8, PDF file iforward.pdf. Retrieved July 23, 2004.
• Gilbert, Steven. "What is a Triad?" PowerPoint presentation. The Teaching, Learning, and Technology Group. http://www.tltgroup.org/media/fl/Triad.htm. Retrieved July 2002.
• Gilbert, Steven. The Teaching, Learning, and Technology Group. http://www.tltgroup.org/resources/TranslucentTechnologies5-08-02.htm. Retrieved July 2002.
• Shaughnessy, J. Michael. "Discussion: Empirical Research on Technology and Teaching Statistics." 1996 IASE Roundtable Conference Proceedings, website version. http://www.stat.Auckland.ac.nz/~iase/php?show=8, PDF file 17.shaughnessy.pdf. Retrieved July 23, 2004.

References (continued)

• Siegrist, Kyle. "The Binomial Coin Experiment." Virtual Laboratories in Probability and Statistics, 1997-2004. University of Alabama in Huntsville. http://www.math.uah.edu/stat/. Retrieved June 2002.
• The Teaching, Learning, and Technology (TLT) Group. Information page. http://www.tltgroup.org/. Retrieved July 2002.
• Valdez, Gilbert. "Evaluation Standards and Criteria for Technology Implementation." North Central Regional Educational Laboratory. http://www.ncrel.org/tandl/eval_standards_and_criteria.htm. Retrieved July 23, 2004.
