TRANSCRIPT
Encounter Data Validation: Review and Project Update
August 25, 2015
Presenters:
Amy Kearney, BA
Director, Research and Analysis Team
Thomas Miller, MA
Executive Director, Research and Analysis Team
1
Welcome
About the presenters
Rules for engagement
Presentation overview
• Understanding Encounter Data Validation (EDV) studies
• SFY 2014-15 EDV Results and Recommendations
• Encounter data quality improvement activities
2
Meeting Objectives
1. To understand the structure and purpose of Encounter Data Validation studies
2. To review the SFY 2014-15 EDV results and recommendations
3. To review ongoing efforts to improve encounter data quality
3
What’s an EDV?
Encounter Data Validation (EDV) – an optional External Quality Review (EQR) activity
Assesses the completeness, timeliness, and accuracy of the encounter data submitted to a state by its managed care organizations (MCOs)
4
Importance of EDV
State Medicaid agencies rely on the quality of encounter data submissions to:
– Accurately and effectively monitor and improve their programs' quality of care
– Establish appropriate performance measures and acceptable rates of performance
– Generate accurate and complete reports
– Obtain complete and accurate utilization information
5
AHCA’s Annual EDV Studies
2013-14 EDV
– Information Systems Review
– AHCA Encounter Data File Review
– Medical Record Review (MRR)
2014-15 EDV
– Encounter Data File Review
– Comparative Analysis
– MRR
2015-16 EDV
– Study design in progress
6
SFY 2014-15 EDV: Study Design
Objectives
– Determine the extent to which encounters in Florida’s Medicaid Management Information System (FMMIS) are complete and accurate when compared to plans’ data
– Assess the completeness and accuracy of plans’ encounter data stored in FMMIS through MRR
7
SFY 2014-15 EDV: Study Design
Evaluation Components
– Encounter data file review
– Comparative analysis
– MRR
Dates of Service
– January 1, 2013 – March 31, 2014
Examined three encounter types
– Professional
– Dental
– Institutional
8
SFY 2014-15 EDV: Study Design
Encounter Data File Review
– Examined the extent to which data submitted by AHCA and the plans were reasonable and complete
– Unique encounters were identified using a combination of plan, recipient ID, provider identification number, and date of service
• A unique control number was not utilized across plans and AHCA
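Because no shared control number existed across plans and AHCA, the composite key above is what defines a unique encounter. A minimal sketch in Python (field names are illustrative, not the actual FMMIS record layout):

```python
# Hypothetical sketch: identify unique encounters using the composite key
# (plan, recipient ID, provider ID, date of service) described on this slide.

def unique_encounters(records):
    """Deduplicate encounter rows on the four-part composite key."""
    seen = set()
    unique = []
    for rec in records:
        key = (rec["plan"], rec["recipient_id"],
               rec["provider_npi"], rec["date_of_service"])
        if key not in seen:          # keep only the first row per key
            seen.add(key)
            unique.append(rec)
    return unique

rows = [
    {"plan": "A", "recipient_id": "R1", "provider_npi": "123", "date_of_service": "2013-05-01"},
    {"plan": "A", "recipient_id": "R1", "provider_npi": "123", "date_of_service": "2013-05-01"},
    {"plan": "A", "recipient_id": "R2", "provider_npi": "123", "date_of_service": "2013-05-01"},
]
print(len(unique_encounters(rows)))  # duplicate row collapses; prints 2
```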
9
SFY 2014-15 EDV: Study Design
Encounter Data File Review, continued
– Key measures
• Volume of submitted encounters over time
• Percent of encounter data fields with a value present
• Percent of encounter data fields with valid values
• Anomalies associated with data extraction and submission were documented
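The two field-level measures above (value present vs. valid value) can be sketched as simple rates; the field name and valid-value set below are hypothetical, not AHCA's actual specifications:

```python
# Illustrative sketch of the two field-level measures: percent of records
# with any value present in a field, and percent with a *valid* value.

def field_rates(records, field, valid_values):
    present = [r[field] for r in records if r.get(field) not in (None, "")]
    pct_present = 100.0 * len(present) / len(records)
    pct_valid = 100.0 * sum(v in valid_values for v in present) / len(records)
    return pct_present, pct_valid

rows = [
    {"procedure_code": "99213"},
    {"procedure_code": ""},        # value missing
    {"procedure_code": "XXXXX"},   # value present but not valid
    {"procedure_code": "99214"},
]
print(field_rates(rows, "procedure_code", {"99213", "99214"}))  # (75.0, 50.0)
```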
10
SFY 2014-15 EDV: Study Design
Comparative Analysis
– Based on the encounter data records present in AHCA’s and plans’ encounter data:
• Element Omission – Were data elements present in plans’ files not present in AHCA’s files?
• Element Surplus – Were data elements present in AHCA’s files not present in plans’ files?
• Element Agreement – For data elements present in both sources, did the values match?
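The three comparative-analysis questions can be illustrated for a single matched record; the element names and values below are invented for illustration and do not reflect the study's actual file layouts:

```python
# Sketch of the three comparisons for one matched encounter: omission
# (in plan's file, absent from AHCA's), surplus (in AHCA's, absent from
# plan's), and agreement (both present: do the values match?).

def compare_elements(plan, ahca):
    omission = [k for k in plan if plan[k] and not ahca.get(k)]
    surplus = [k for k in ahca if ahca[k] and not plan.get(k)]
    both = [k for k in plan if plan[k] and ahca.get(k)]
    agreement = {k: plan[k] == ahca[k] for k in both}
    return omission, surplus, agreement

plan_rec = {"diagnosis": "E11.9", "modifier": "25", "ndc": ""}
ahca_rec = {"diagnosis": "E11.9", "modifier": "", "ndc": "00002-8215-01"}
print(compare_elements(plan_rec, ahca_rec))
```

Here "modifier" would count toward the omission rate, "ndc" toward the surplus rate, and "diagnosis" toward the agreement rate.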
11
SFY 2014-15 EDV: Study Design
Comparative Analysis, continued
– Three Key Steps:
1. Develop data submission requirements
2. Conduct file review
3. Conduct comparative analysis of encounter data
12
SFY 2014-15 EDV: Study Design
MRR
– Assessed whether key data elements in AHCA’s data were complete and accurate when compared to medical records
– Four Key Steps:
1. Identification of eligible population and generation of MRR samples
2. Medical record procurement
3. Medical record abstraction
4. MRR analysis of abstracted data
13
SFY 2014-15 EDV: Study Design
MRR, continued
– Assessed whether the following key data elements were complete and accurate
14
Key Data Elements for Medical Records Review
Key Data Fields                              Professional   Dental   Institutional
Date of Service                                   √            √          √
Diagnosis Code                                    √                       √
CPT/CDT/HCPCS Code/Surgical Procedure Code        √            √          √
Procedure Code Modifier                           √            √          √
SFY 2014-15 EDV: Study Design
MRR, continued
– Four study indicators to report results:
• Medical Record Omission
• Encounter Data Omission
• Coding Accuracy
• Overall Accuracy
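Read as set comparisons between dates of service in the encounter data and those documented in the medical record, the first two indicators might be sketched as follows (this reading is an assumption for illustration, not the study's formal definitions, and the dates are invented):

```python
# Hedged sketch: medical record omission = service in the encounter data
# with no supporting medical record documentation; encounter data omission
# = documented service missing from the encounter data.

encounter_dates = {"2013-02-01", "2013-04-15", "2013-07-09"}
record_dates = {"2013-02-01", "2013-07-09", "2013-09-30"}

medical_record_omission = encounter_dates - record_dates  # in data, not in record
encounter_data_omission = record_dates - encounter_dates  # in record, not in data
print(sorted(medical_record_omission), sorted(encounter_data_omission))
```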
15
Meeting Objectives
1. To understand the structure and purpose of Encounter Data
Validation studies
2. To review the SFY 2014-15 EDV results and recommendations
3. To review ongoing efforts to improve encounter data quality
16
EDV Results: Encounter Data File Review
Figure 1—Monthly Variations in Professional Encounters for Plans and AHCA
18
EDV Results: Encounter Data File Review
Figure 2—Monthly Variations in Dental Encounters for Plans and AHCA
19
EDV Results: Encounter Data File Review
Figure 3—Monthly Variations in Institutional Encounters for Plans and AHCA
20
EDV Results: Encounter Data File Review
Variation was present in the overall and month-to-month submission of encounters by type and source; the greatest variation was noted with institutional encounters
Required data elements (e.g., Recipient ID, Procedure Code, and Primary Diagnosis) were consistently complete and contained reasonable values
21
EDV Results: Encounter Data File Review
Recommendations
– Investigate differences identified in monthly encounter data volume and reconcile where appropriate
• Review activities should focus on determining whether differences are due to failed or incomplete submissions or processing parameters
22
EDV Results: Comparative Analysis
Record Completeness
– Findings
• Record omission and surplus rates varied considerably across plans
– Dental encounters were the most complete (11.9 percent omission rate and 30.0 percent surplus rate)
– Institutional encounters were the least complete (84.7 percent omission rate and 41.1 percent surplus rate)
– Recommendations
• Review and update encounter data submission standards to ensure they meet current reporting requirements
• Conduct root cause analyses related to low-performing encounter types and elements
23
EDV Results: Comparative Analysis
Encounter Data Element Completeness
– Overall, encounter data elements exhibited a high level of completeness across all encounter types
– Provider-related encounter data elements were most frequently associated with incomplete data
• Referring Provider NPI—professional omission and surplus rates > 10%
• Billing Provider NPI—dental omission rate = 13.6%
• Rendering Provider NPI—dental surplus rate = 17.5%
• Attending and Referring Provider—institutional surplus rate > 75%
– Encounter data element omission and surplus rates varied by field and plan
24
EDV Results: Comparative Analysis
Encounter Data Element Completeness, continued
– Dental encounters
• Omission rates showed less variation across plans than other encounter types
• Encounter data element surplus rate differences were greatest for Line Date of Service, Billing Provider NPI, and Rendering Provider NPI
– Institutional encounters
• Omission and surplus rate variation was mixed
• Omission rates for nearly half of the evaluated elements exhibited minimal variation across plans
25
EDV Results: Comparative Analysis
Encounter Data Element Completeness, continued
– Recommendations
• Review State and plan processes for submitting, tracking, and storing provider information
• Continue collaborative activities focused on exploring reasons for incomplete data submissions and developing improvement strategies
26
EDV Results: Comparative Analysis
Encounter Data Element Agreement
– Professional encounters
• Agreement rates were high for key data elements, with Procedure Code, NDC, and Primary Diagnosis Code exhibiting agreement rates of at least 90%
– Dental encounters
• High levels of agreement for key data elements, with the exception of Dental Procedure Code
– Institutional encounters
• High levels of agreement for one-third of key data elements
27
EDV Results: Comparative Analysis
Encounter Data Element Agreement, continued
– Recommendations
• AHCA should continue working collaboratively with key stakeholders to develop and implement a monitoring strategy to routinely examine claims and encounter volume
• AHCA and the plans should regularly review existing contracts and encounter data documentation to ensure clear expectations surrounding the collection and submission of data
28
EDV Results: MRR
Medical Record Submission
– 1,234 sample cases requested for MRR
• 981 records submitted by plans
• 192 records where the provider refused to submit
• 59 records where the provider was unable to locate the record
• 2 records submitted for the incorrect patient
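The four counts above exactly partition the 1,234 requested cases; a quick arithmetic check, with the implied submission rate (the 79.5 percent figure is computed here, not reported on the slide):

```python
# Verify that the disposition counts partition the requested sample, and
# derive the record submission rate (computed here for illustration).

requested = 1234
submitted, refused, not_located, wrong_patient = 981, 192, 59, 2

assert submitted + refused + not_located + wrong_patient == requested
print(round(100 * submitted / requested, 1))  # 79.5
```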
29
EDV Results: MRR
Encounter Data Completeness
– AHCA’s encounter data was only moderately supported by enrollees’ medical records (medical record omission)
– Plans’ encounter data was only moderately supported by AHCA’s encounter data (encounter data omission)
30
EDV Results: MRR
Encounter Data Element Accuracy
– Overall encounter data element accuracy was high
• Diagnosis codes (95.4 percent)
• Procedure codes (82.3 percent)
• Procedure code modifiers (99.3 percent)
– Only about one-third of encounters accurately represented all three elements relative to the medical record
31
Table 6-2—Encounter Data Element Accuracy Summary for Overall Population

Element                   Statewide Rate   MCP Range      Main Error Type
Diagnosis Code            95.4%            87.0% - 100%   -
Procedure Code            82.3%            66.1% - 100%   Inaccurate Code (37.0%), Higher Level Service (26.4%), Lower Level Service (36.6%)
Procedure Code Modifier   99.3%            75.0% - 100%   -
All-Element Accuracy      31.9%            0.0% - 79.2%   -
EDV Results: MRR
Recommendations
– AHCA should continue working collaboratively with key stakeholders to develop and implement a monitoring strategy that audits provider encounter submissions for completeness and accuracy
32
Questions?
33
Meeting Objectives
1. To understand the structure and purpose of Encounter Data
Validation studies
2. To review the SFY 2014-15 EDV results and recommendations
3. To review ongoing efforts to improve encounter data quality
34
Improving Encounter Data Quality
AHCA explored reasons for incomplete encounter data submissions from plans and began developing strategies to improve rates.
– Developed an encounter data support process
– Worked to resolve encounter data submission issues and improve timeliness and accuracy
• Dedicated email account
• On-site plan visits
• Webinars
• Conference calls
36
Improving Encounter Data Quality
AHCA ensured there was a reliable process for timely submission of data from plans.
– Implemented timeliness reports that are provided to plan managers.
– Plan managers work with the plans to ensure contract timeliness compliance.
37
Improving Encounter Data Quality
AHCA organized a webinar explaining the comprehensive list of operational edits associated with the error categories identified in the feedback/response files.
– HP created encounter data reports listing the operational edits the plans are receiving on encounter files.
– Worked with plans by phone, webinar, and on-site visits to provide information regarding errors and feedback on resolution.
38
Meeting Objectives
1. To understand the structure and purpose of Encounter Data Validation studies
2. To review the SFY 2014-15 EDV results and recommendations
3. To review ongoing efforts to improve encounter data quality
39
Questions
SFY 2015-16 EDV Study: Next Steps
Study Design – July/August
Data Request – September
Data Collection – October
Conduct Analysis
– Encounter Data File Review – November/December
– Comparative Analysis – December - March
– Medical Record Review – January - May
Reporting – May/June
Contact Information
Amy Kearney
Director, Research & Analysis Team
[email protected]

Tom Miller
Executive Director, Research & Analysis Team
[email protected]

Mary Wiley
Director, State & Corporate
[email protected]
Thank you!
Please take a moment to complete the webinar survey that will pop up when the meeting is
complete. We value and appreciate your feedback!