
Data Quality Toolbox for Registrars

MCSS Workshop

December 9, 2003

Elaine Collins

Quality Data Toolbox

• Artisan: Registrar

• Medium: Computerized data

• Raw materials: Medical information

• Shaping tools: Knowledge, skills

• Directions: Standards

• Measuring tools: Editing “tools”

• Final product: Cancer record

• Goodness: Match to standards

Quality Data - Goodness

• Accurate

• Consistent

• Complete

• Timely

• Maintains shape across transformation and transmission

Measuring Tools

• Reabstracting studies

• Structured queries and visual review

• Text editing

• EDITS

• MCSS routine review

Exercises

• MCSS reabstracting study – 2003

• Sites: Breast, Corpus uteri, Lung, Melanoma, Testis, Soft tissue sarcoma

• 2000 diagnosis year

• 12 facilities

• Review of reported data – Structured query

• Review of reported data – Text editing

Reabstracting Studies

• Compares original medical record with reported cancer record

• Considered the “gold standard”

• Labor-intensive; all records used at initial abstracting may not be available; biased by the reabstractor’s training and skills (a discrepancy tally is sketched below)
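As a rough illustration of how reabstracting results can be tallied, here is a minimal sketch in Python. The field names and records are hypothetical; a real study compares many more data items.

```python
# Minimal sketch of a reabstracting comparison: tally, per field, how often
# the reabstracted "gold standard" value disagrees with the reported value.
# Field names and records here are hypothetical illustrations.
from collections import Counter

def tally_discrepancies(reported_cases, reabstracted_cases, fields):
    """Count field-level disagreements across paired case records."""
    discrepancies = Counter()
    for reported, reabstracted in zip(reported_cases, reabstracted_cases):
        for field in fields:
            if reported.get(field) != reabstracted.get(field):
                discrepancies[field] += 1
    return discrepancies

reported = [{"primary_site": "C50.4", "ext": "10", "stage": "I", "surgery": "20"}]
gold = [{"primary_site": "C50.4", "ext": "20", "stage": "IIA", "surgery": "20"}]
print(tally_discrepancies(reported, gold, ["primary_site", "ext", "stage", "surgery"]))
# Counter({'ext': 1, 'stage': 1})
```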

Structured Queries

• Compares coding across series of records sorted by selected characteristics

• Useful for finding pattern discrepancies across many records

• Manual process; some comparisons may be converted to automated edits (one such query is sketched below)
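A structured query of this kind can be approximated in code: sort or group records by selected characteristics and surface the combinations that break the pattern. A minimal sketch with pandas; the column names and codes are illustrative assumptions, not MCSS field definitions.

```python
# Sketch of a structured query: sort cases by site and histology, then flag
# site/histology pairs that appear only once, which may signal a coding
# discrepancy worth visual review. Column names and codes are hypothetical.
import pandas as pd

cases = pd.DataFrame({
    "primary_site": ["C50.4", "C50.4", "C34.1", "C34.1", "C34.1"],
    "histology":    ["8500",  "8500",  "8140",  "8140",  "8500"],
})

sorted_cases = cases.sort_values(["primary_site", "histology"])
pair_counts = sorted_cases.value_counts(["primary_site", "histology"])
rare_pairs = pair_counts[pair_counts == 1]   # singletons stand out in the sort
print(rare_pairs)  # ('C34.1', '8500') appears once: review this case
```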

Text Editing

• Compares text with coded values for individual records

• Useful for immediately identifying coding problems

• Manual process; most effective on completion of each individual case (a partial automation is sketched below)
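Parts of a text edit can be automated by scanning the abstract’s text for terms that should agree with the coded value. A minimal sketch, assuming a hypothetical keyword-to-code lookup; the terms and histology codes are illustrative only.

```python
# Sketch of a text edit: compare free-text pathology wording against the
# coded histology on a single record. Keyword/code pairs are illustrative.
TEXT_TO_HISTOLOGY = {
    "ductal carcinoma": "8500",
    "adenocarcinoma": "8140",
    "melanoma": "8720",
}

def text_edit(path_text, coded_histology):
    """Return a warning when the text suggests a different histology code."""
    lowered = path_text.lower()
    for term, code in TEXT_TO_HISTOLOGY.items():
        if term in lowered and code != coded_histology:
            return f"text says '{term}' ({code}) but record is coded {coded_histology}"
    return None

print(text_edit("Infiltrating ductal carcinoma, grade 2", "8140"))
```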

EDITS

• Checks range validity for many fields and comparability of a few fields on individual records

• Automated process, can be applied on completion of each record or on preparation of batch report; warnings and over-rides are alternatives to failures

• Expansion of interfield edits requires careful logic (a minimal example is sketched below)
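In the spirit of an EDITS run, a single-record check can separate hard failures (invalid codes or ranges) from over-ridable warnings raised by interfield comparisons. A minimal sketch; the field names, valid code sets, and the interfield rule are assumptions for illustration, not the EDITS metafile.

```python
# Sketch of single-record edits: range checks on individual fields plus one
# interfield comparison, reported as failures or over-ridable warnings.
# Field names, valid code sets, and the rule itself are illustrative.
def run_edits(record):
    failures, warnings = [], []
    if not record.get("dx_year", "").isdigit():            # validity edit
        failures.append("dx_year must be numeric")
    if record.get("sex") not in {"1", "2", "9"}:           # valid-code edit
        failures.append("sex code out of range")
    # Interfield edit: prostate primaries should not be coded female.
    if record.get("primary_site", "").startswith("C61") and record.get("sex") == "2":
        warnings.append("primary site C61 with sex = 2: verify or over-ride")
    return failures, warnings

failures, warnings = run_edits({"dx_year": "2000", "sex": "2", "primary_site": "C61.9"})
print(failures)   # []
print(warnings)   # ['primary site C61 with sex = 2: verify or over-ride']
```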

Edits Analysis

• Edits to be included in MCSS Set

• Edits in Hospital/Staging Edit Sets – C edits are included in confidential data set

• No Text Edits displayed

• Criteria
– Valid codes/dates
– Alpha/numeric
– Timing
– Interfield comparisons
– Absolute conditions

MCSS Review

• Requests values for missing or unknown data; resolves conflicts between data items from multiple facilities and between data items updated by a single facility

• Allows incorporation of information from multiple facilities

• Review for a limited number of conditions (a consolidation rule is sketched below)
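Conflict resolution across facility reports can be sketched as a consolidation rule: let a known value fill in unknowns, and flag genuine disagreements for manual review. The unknown-value codes below are illustrative assumptions.

```python
# Sketch of central-registry consolidation: merge one data item reported by
# several facilities, preferring known values and flagging true conflicts.
# The unknown-value codes ("", "9", "99") are illustrative assumptions.
UNKNOWN = {"", "9", "99", None}

def consolidate(values):
    """Return (value, needs_review) for one item across facility reports."""
    known = {v for v in values if v not in UNKNOWN}
    if not known:
        return None, False            # nothing to resolve; item stays unknown
    if len(known) == 1:
        return known.pop(), False     # the known value fills in the unknowns
    return sorted(known)[0], True     # real conflict: flag for manual review

print(consolidate(["9", "2", ""]))    # ('2', False)
print(consolidate(["1", "2"]))        # ('1', True) -> registrar resolves
```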

Same Discrepancies Found on Different Reviews

(Counts of discrepancies identified by each review method, by data item)

Method          CANCER   EXTENT   STAGE   SURGERY
Reabstracting      216      155     275       149
Visual              99      110     159        66
Text                79       74      77        42
EDITS                0       16       1         5
MCSS                22        4       4         0

Cancer Registrar – Resource for Quality Data

[Diagram: the registrar at the center, connected to the medical record, facility system, facility staff, physicians, patients, and other registries; to the standard setters ICD-O, COC, AJCC, SEER, and NAACCR; and outward to committees, protocols, NCDB, the central registry, quality monitors, CDC, cancer research, cancer control, NAACCR, and the public.]

Data Inputs

• Patient data from facility systems

• Medical record reports and notes

• Pathology reports

• Staging forms

• Communication with physician offices

• Communication with other registries

• Communication with patients

Process Inputs

• Registrar training, knowledge, skills

• Coding standards – ICD-O-3, COC, AJCC, SEER, NAACCR

• Interpretations of standards – I&R, SEER Inquiry, Ask NAACCR

• Medical literature – printed and online

• Registry software data implementations

Sources of Error

• Patient data from facility systems

• Medical record reports and notes

• Pathology reports

• Staging forms

• Communication with physician offices

• Communication with other registries

• Communication with patients

Sources of Error

• Registrar training, knowledge, skills

• Coding standards – ICD-O-3, COC, AJCC, SEER, NAACCR

• Interpretations of standards – I&R, SEER Inquiry, Ask NAACCR

• Medical literature – printed and online

• Registry software data implementations

Types of Errors

• Missing/conflicting data

• Shared data errors

• Timing/coding errors

• Standards and interpretations – ambiguities, omissions, confusions, contradictions

• Discrepancies among local registry practice, central registry practice, and national standards

Software Implementations

• Discrepancies between implementations and national standards

• Lack of registrar knowledge/training on correspondence between registry and exported data

• Logic errors in matching registry data to reporting formats

• Conversion errors (a round-trip check is sketched below)
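One guard against mapping and conversion errors is a round-trip check: export a record to the reporting format, read it back, and confirm that every field survives. A minimal sketch, assuming a hypothetical fixed-width layout rather than any actual NAACCR record format.

```python
# Sketch of a round-trip check on an export mapping: write a record in a
# hypothetical fixed-width layout, read it back, and compare field by field.
LAYOUT = [("primary_site", 4), ("histology", 4), ("dx_year", 4)]  # assumed

def export(record):
    return "".join(str(record[name]).ljust(width)[:width] for name, width in LAYOUT)

def reimport(line):
    record, pos = {}, 0
    for name, width in LAYOUT:
        record[name] = line[pos:pos + width].strip()
        pos += width
    return record

original = {"primary_site": "C504", "histology": "8500", "dx_year": "2000"}
assert reimport(export(original)) == original, "conversion error in mapping"
print("round trip OK")
```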

AJCC Staging Dilemma

• Are pathologic nodes required for pathologic stage grouping?

• How do Minnesota registrars answer this question? (see below)
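The study tabulated below labels each case by the source (clinical or pathologic) of its T, N, and M values and of its stage grouping. That labeling can be sketched as a small function; the pattern strings follow the table, but the function is an illustration, not an AJCC rule.

```python
# Sketch of the case labeling used in the study table below: record whether
# each staging element (T, N, M) was clinical (c) or pathologic (p), plus
# which stage grouping (cST or pST) the registrar assigned.
def staging_pattern(t_source, n_source, m_source, group_source):
    """e.g. staging_pattern('p', 'p', 'c', 'p') -> 'pTpNcM, pST'"""
    return f"{t_source}T{n_source}N{m_source}M, {group_source}ST"

# A pathologic stage grouping assigned without pathologic nodes shows up as:
print(staging_pattern("p", "c", "c", "p"))   # 'pTcNcM, pST'
```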

Clinical/Pathologic Staging in Study

Counts by site (Breast, Corpus uteri, Lung, Melanoma, Testis, Sarcoma), by stage grouping pattern:

Single group

• cTcNcM, cST: 54 1
• cTcNpM, cST: 18
• pTcNcM, cST: 9 2 2 3 21 1
• pTpNcM, cST: none
• pTpNpM, cST: 2
• cTcNcM, pST: 1
• pTcNcM, pST: 5 37 4 31 27 10
• pTpNcM, pST: 74 40 20 30 1 3
• pTpNpM, pST: 6 1

Two groups

• c99, p99: 3 6 9 2
• cST, p99: 4 1 1
• c99, pST: 4 1 6 1
• cST, pST: 13 5 7 3 6 3

No staging: 1 4 7 5 3

Collaborative Staging

• Provides specific rules for coding known vs unknown staging elements

• Accommodates “best” stage for AJCC stage assignment (a selection rule is sketched below)
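The “best” stage idea can be sketched as an element-by-element choice: prefer a known pathologic value, fall back to a known clinical one. A hypothetical illustration, not the Collaborative Staging algorithm itself.

```python
# Sketch of a "best stage" element choice: prefer a known pathologic value,
# fall back to the clinical value, else mark unknown. Codes are illustrative;
# this is not the Collaborative Staging algorithm itself.
UNKNOWN = {"", "X", "99", None}

def best_element(clinical, pathologic):
    if pathologic not in UNKNOWN:
        return pathologic
    if clinical not in UNKNOWN:
        return clinical
    return "unknown"

print(best_element("T2", "X"))    # 'T2'  (pathologic unknown, clinical used)
print(best_element("N0", "N1"))   # 'N1'  (pathologic known, preferred)
```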

AHIMA 75th Annual Conference, October 2003, Minneapolis:

Coming Events

• Data mining

• ICD-10-CM

• SNOMED

• Natural language processing

AHIMA 75th Annual Conference, October 2003, Minneapolis:

Challenges

• What is our professional purpose?

• How do we envision ourselves as professionals?

Foundation for Quality Data

• Registrar’s commitment to registry purpose

• Registrar’s knowledge, understanding of cancer data

• Registrar’s management of communication technologies

• Registrar’s advocacy for data use

SUMMARY

• Consistent recording and reporting of quality cancer data requires commitment.

• Routine and regular review of data patterns facilitates data knowledge and quality.

• Passing EDITS assists but does not ensure data quality.

• Data standards change; use the manuals.

• Welcome Collaborative Stage.
