
Outline for the Day

- Building Blocks for Digital Curation Programs
- Standards as Frameworks for Action
- Lunch, 12:00-1:00
- Use and Re-use Over Time
- Sustainability, Advocacy, and Engagement
- Wrap-up and Evaluation

There will be a break in the morning and in the afternoon.

Goals for Part 3: Use & Re-Use

- Become familiar with the concepts and objectives of workflows that enable long-term access
- Explore options for managing expectations across the lifecycle
- Review the fundamentals of user-based evaluation for assessing and improving services
- Understand the benefits of developing a culture of assessment for repository sustainability

Building Blocks re: Use & Re-Use

1. Conceptual frameworks
2. Organizational infrastructure
3. Technological infrastructure
4. Resource framework
5. Policy framework
6. Roles & responsibilities
7. Stakeholders
8. Content characteristics
9. Standards
10. Holistic workflows
11. Strategy & planning
12. Outreach & advocacy
13. Ongoing evaluation

EFFECTIVE WORKFLOWS


DigCCurr: Transition Point in Life of Digital Object

Work Flow

"the sequence of processes through which a piece of work passes from initiation to completion" (Oxford English Dictionary, Second Edition, 1989)

Converging Developments

Tools:
- Perform functions
- Populate workflows

Workflows:
- Integrated into repositories or standalone
- Combine one or more tools with human action

Workflow Concepts

Definition of workflow:
- Description of practice and procedures
- Automation of repetitive tasks
- Graphic representation of the flow of work

Workflow engine concepts:
- Orchestration: composition and execution of new services (definition)
- Choreography: interaction/coordinated action between services (description)

(A minimal code sketch of the two engine patterns follows.)
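
To make the two engine concepts concrete, here is a minimal sketch; it is not from the presentation, and the service and event names (virus_scan, checksum, object.received) are invented. Orchestration composes existing services into a new one under central control; choreography lets services react to one another's events with no central coordinator.

```python
# Hypothetical sketch; service and event names are invented.

def virus_scan(obj):
    print(f"scanning {obj}")
    return obj

def checksum(obj):
    print(f"checksumming {obj}")
    return obj

# Orchestration: "ingest" is a new service that composes existing ones
# and controls the order of execution itself.
def ingest(obj):
    return checksum(virus_scan(obj))

# Choreography: a tiny event bus; each service subscribes to events and
# reacts, with no central coordinator.
subscribers = {}

def on(event, handler):
    subscribers.setdefault(event, []).append(handler)

def emit(event, obj):
    for handler in subscribers.get(event, []):
        handler(obj)

on("object.received", lambda obj: emit("object.scanned", virus_scan(obj)))
on("object.scanned", checksum)

ingest("report.pdf")                    # orchestrated path
emit("object.received", "report.pdf")   # choreographed path
```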

Workflows vs. Functions

In practice, the functions have to be pieced together in specific ways that are appropriate to an organizational context or need.

If the functions are the verbs, then the workflows are the sentences (or paragraphs...).

Workflows vs. Tasks

- Tasks are discrete
- Workflows link tasks in a logical fashion
- Workflows depend upon interoperability (see the sketch below)
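
A minimal sketch of the distinction, with invented task names: each task is a discrete function, the workflow is the ordered list that links them, and interoperability is modeled by every task accepting and returning the same record dict.

```python
# Hypothetical sketch; the tasks, fields, and sample record are invented.
import hashlib

def identify_format(record):
    record["format"] = record["path"].rsplit(".", 1)[-1]
    return record

def compute_checksum(record):
    record["sha256"] = hashlib.sha256(record["content"]).hexdigest()
    return record

def record_metadata(record):
    print(f"{record['path']}: format={record['format']}, sha256={record['sha256'][:12]}...")
    return record

# The workflow links discrete tasks in a logical order: if tasks are the
# verbs, this list is the sentence.
INGEST_WORKFLOW = [identify_format, compute_checksum, record_metadata]

def run(workflow, record):
    for task in workflow:
        record = task(record)  # each task's output is the next task's input
    return record

run(INGEST_WORKFLOW, {"path": "letters/1912-04-03.tiff", "content": b"II*\x00"})
```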

Workflow Influences

Critical path method (project management; computed in the sketch below):
1. List all activities
2. Determine the time (duration) for completion
3. Identify dependencies between activities

Process improvement examples:
- Six Sigma
- Total Quality Management (TQM)
- Business Process Reengineering
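
Those three steps are all a critical path computation needs. Below is a minimal sketch; the activities and durations describe a hypothetical digitization project, not anything from the presentation.

```python
# Hypothetical sketch: activity names, durations, and dependencies are invented.
from functools import lru_cache

# activity: (duration in days, list of prerequisite activities)
activities = {
    "select":   (2,  []),
    "conserve": (5,  ["select"]),
    "scan":     (10, ["conserve"]),
    "describe": (4,  ["select"]),
    "publish":  (1,  ["scan", "describe"]),
}

@lru_cache(maxsize=None)
def earliest_finish(name):
    duration, deps = activities[name]
    return duration + max((earliest_finish(d) for d in deps), default=0)

# The critical path ends at the activity that finishes last and runs back
# through whichever prerequisite finishes last at each step.
end = max(activities, key=earliest_finish)
path = [end]
while activities[path[-1]][1]:
    path.append(max(activities[path[-1]][1], key=earliest_finish))

print("critical path:", " -> ".join(reversed(path)))      # select -> conserve -> scan -> publish
print("minimum duration:", earliest_finish(end), "days")  # 18
```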

Considerations

Purpose:
- As-is: document what is happening now
- To-be: document what should happen

Right-sized:
- Appropriate granularity for the problem and setting
- Extent and type of documentation

Maintenance:
- Changes in staff and roles
- New or changed functions
- New tools and common workflows

EXAMPLES


PANIC, 2003-2004

version 0.10 in 2013


Discussion 3: Archival Scenario

What kinds of risks do archivists need to avoid/mitigate to preserve authentic digital records?
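
One recurring answer is the risk of silent file corruption, and routine fixity checking is a standard mitigation. A minimal sketch, assuming a sha256sum-style manifest recorded at ingest; the manifest filename is hypothetical.

```python
# Hypothetical sketch of a fixity audit; the manifest name is invented.
import hashlib
from pathlib import Path

def sha256_of(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def audit(manifest):
    """Compare stored checksums against the files' current contents."""
    for line in Path(manifest).read_text().splitlines():
        expected, name = line.split("  ", 1)  # "hash  filename", as sha256sum writes
        status = "ok" if sha256_of(name) == expected else "FIXITY FAILURE"
        print(f"{name}: {status}")

# audit("manifest-sha256.txt")  # run against a manifest recorded at ingest
```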

EVALUATION & ASSESSMENT

Evaluation

"A systematic process for ascertaining whether and why an object of study meets the goals envisioned for that object."

Gary Marchionini & Gregory Crane, "Evaluating Hypermedia and Learning: Methods and Results from the Perseus Project," Transactions on Information Systems (TOIS), Vol. 12, Issue 1 (January 1994): 6.

Assessment vs. Evaluation

Academics tend to use the term "evaluation" for this constellation of activities. Administrators in higher education and libraries are coming to use the term "assessment" when thinking about their programs.

In both cases, we want to know how well we are doing and to base that assessment on measurable data.

High-Level Rationale for Evaluation & Assessment

- Assessment is the basis of self-understanding and improvement.
- Sharing results through publication leads to profession-wide benchmarks and overall understanding and improvement.
- A culture of assessment can arise when fostered by administrators and managers.

Rationale

The ability to accurately compare and contrast program metrics with those of like institutions helps to set standards for services and assists in planning for improvements to those functions.

The ultimate goal of these projects was to provide archivists with data collection and analysis tools to support decision-making.

Elements of Evaluation/Assessment

- Goals and objectives of the object, system, process, etc.
- Evaluators: often other humans, i.e., staff and users of the system.
- Methodology.
- Data.
- Comparison of goals for the object, event, process, etc. under study.
- Conclusions.

Evaluation Issues

- Complex process.
- Requires some level of training.
- Takes time and resources.
- No single golden method exists.
- Multiple methods yield the best view of "reality."
- Rigorous sampling is essential: a study is only as good as its sample (see the sketch below).
- Quantitative/qualitative: an artificial dichotomy?
- Privacy and Institutional Review Board (IRB) approval.
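
Drawing a defensible sample can be as simple as the sketch below; the user list, sample size, and seed are invented for illustration.

```python
# Hypothetical sketch: a simple random sample of users to survey.
import random

registered_users = [f"user{n:03d}" for n in range(1, 241)]  # e.g., reading-room registrations

random.seed(42)  # record the seed so the draw can be reproduced
sample = random.sample(registered_users, k=30)  # sampled without replacement
print(sample[:5])
```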

BUILDING A CULTURE OF ASSESSMENT

What Is a Culture of Assessment?

"An organizational environment in which decisions are based on facts, research and analysis, and where services are planned and delivered in ways that maximize positive outcomes and impacts for customers and stakeholders."

Amos Lakos & Shelley Phipps, "Creating a Culture of Assessment," portal: Libraries and the Academy, Vol. 4, No. 3 (2004), pp. 345–361.

Why a Culture of Assessment?

- Assessment, based on measurable, reliable, and valid data, is a pathway to sustainability.
- Assessment metrics make the case for value and continuing funding.
- User-based assessment metrics are an advocacy tool, drawing in the user, the collections donor, and the resource allocator.

Creating a Culture of Assessment

- User-based evaluation needs to be established on concepts that are specific to archives and special collections.
- If archivists do not do this for themselves, someone else will.
- The Archival Metrics Toolkits were developed by and for archivists, a collaboration between researchers and practitioners.

USER-BASED EVALUATION


User-Based Evaluation

- Tells us how users view our constellation of services, service delivery, and resources, such as collections.
- Is not collection-based but user-centric.
- Can tell us about fulfillment of needs as well as user satisfaction.
- Helps information professionals better allocate their resources for improved performance and user satisfaction.
- Few user-based studies have been conducted and fewer published, especially from archives and museums, so there is much to learn.
- There is little comparability across the studies conducted.

Evaluation of Digital Projects

Designed to:
- Assess what has been achieved with large investments of resources.
- Improve current provision (e.g., usability testing, interface design).
- Provide public accountability (often public funds are used).
- Fulfill requirements of funding bodies.

Evaluation of Digital Projects (continued)

- Helps to show who the users are and what they are doing/requesting.
- Measures the project's/program's impact (e.g., on teaching, research, lifelong learning).
- Informs future developments and serves as a basis for future projects or grants.

And...

- No good understanding yet of the use of digital objects in collections.
- We don't know what people want/need in terms of: the collection, services, or what constitutes good service.

ARCHIVAL METRICS PROJECTS

www.archivalmetrics.org

Archival Metrics Toolkits

Developed to meet the needs of archivists in evaluating their services to specific user groups (a scoring sketch follows the list):
- Researcher
- Archival Website
- Online Finding Aids
- Student Researcher
- Teaching Support
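
The toolkits themselves are questionnaires and instructions rather than software, but tallying the responses they produce is straightforward. A minimal sketch with invented questions and Likert-scale scores, not drawn from the actual toolkits:

```python
# Hypothetical sketch: summarizing Likert-scale survey responses.
from statistics import mean

# 1 = very dissatisfied ... 5 = very satisfied
responses = {
    "Staff were knowledgeable and helpful": [5, 4, 5, 3, 4, 5],
    "The finding aids were easy to use":    [3, 2, 4, 3, 3, 2],
}

for question, scores in responses.items():
    share_satisfied = sum(s >= 4 for s in scores) / len(scores)
    print(f"{question}: mean {mean(scores):.2f}, {share_satisfied:.0%} rated 4 or 5")
```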

Discussion 4: Archival Scenario

1. What types of evaluation have you conducted in your repository?
2. What types would you like to conduct?
