NCA Residency Session 8, April 5, 2017

AGENDA: Learning Collaborative Session 8, April 6, 3:00-4:30pm (EST). Team Report Outs (Successes and Challenges, Recruitment Updates, Questions for Faculty); Position Offers/Contracts/Agreements; Onboarding/Licensing/Credentialing; Evaluation of the Program; Preparing for Accreditation. Action Period Items: Monthly Reports; Draft Contracts/Agreements; Precepting Panel questions. Next Session: May 3.

Upload: chc-connecticut

Posted on 22-Jan-2018


Category: Healthcare



TRANSCRIPT

AGENDA: Learning Collaborative Session 8, April 6, 3:00-4:30pm (EST)

Team Report Outs: Successes and Challenges, Recruitment Updates, Questions for Faculty

Position Offers/Contracts/Agreements

Onboarding/Licensing/Credentialing

Evaluation of the Program; Preparing for Accreditation

Action Period Items: Monthly Reports

Draft Contracts/Agreements

Precepting Panel questions

Next Session: May 3

Team Report Outs

Successes and Challenges, Recruitment updates,

Questions for Faculty

Offers, Contracts and Agreements

OFFERS:

• Determine how and when to communicate offers. Offers and declinations are made using the ranking log.

• Determine the length of time allowed for a decision; at CHC this is 48 hours.

• In the case of a “tie”, interviewers must discuss candidates and choose.

• Prepare for “back-up offers” and a waiting list

Contracts and Agreements

• A formal employment contract should immediately follow the offer.

• Determine the method of delivery (electronic or direct mail) and the length of time allowed to return the signed contract

• The contract can be a modified version of your organization’s existing employment contract. Items that may differ in the contract include:

• Term of the contract: 12-month residency program

• Practice location

• Salary

• PTO

• CME

• Employment requirement post residency year: determine the length of commitment and subsequent-year salaries.

NEXT STEPS:

Onboarding; tracking incoming residents’ credentialing, licensure, and certification materials

Sample Contract

Licensing and Credentialing

• Offers have been made and accepted – start immediately!

• The process is a domino effect and timelines are short

• Follow your organization’s general policy – adjust as needed

• Be prepared for delays based on the states candidates come from

• Guide your candidates through the process and keep track of their status

Licensing and Credentialing

NP Residents

1. Sit for and pass boards
2. Apply for state RN license
3. Apply for state APRN license
4. Apply for state controlled substance license
5. Apply for federal DEA license

Post Doc Residents

1. Post docs are unlicensed and work under the supervisor’s license.
2. Verify that work under another’s license is a billable service in your state; there is wide variability. In CT, Husky (Medicaid) is billable but most private insurances are not.
3. Be aware of licensing requirements in your state, or in the state where the post doc wishes to seek licensure, and provide appropriate supervision and documentation.

Onboarding

• In addition to the licensing and credentialing process, residents must be onboarded

• Leverage your HR department to help apply the organization’s process for onboarding all new staff

• HR connects with Residents prior to start date and is also invited to orientation

• Residents are employees and their onboarding should look very similar

• We will cover orientation in more detail later!

Program Evaluation: Improving Performance

Presented by Candice Rettie, PhD. NCA Webinar, April 2017

Overview of the Session

Definitions and Process of Good Program Evaluation

How to Design Meaningful Evaluation

– Integrated throughout the program, from recruitment to graduation
– Creates explicit expectations for the trainee
– Documents programmatic success
– Fosters improvement, positive growth, creativity and innovation

Characteristics of Useful Evidence

Learning Objectives

Knowledge:

– Understand the purpose of evaluation
– Know the characteristics of good evaluation
– Understand the process of evaluation
– Understand the connection with curriculum

Attitude:

– Embrace the challenge
– Value the outcomes

Skills:

– To be gained by independent / group work focused on the local training program

Definitions:

Evaluation: systematic investigation of the merit, worth, or significance of an effort.

Program evaluation: evaluates the specific projects and activities that target audiences may take part in.

Stakeholders: those who care about the program or effort.

Approach: practical, ongoing evaluation involving program participants, community members, and other stakeholders.

Importance:

1. Helping to clarify program plans;
2. Improving communication among participants and partners;
3. Gathering the feedback needed to improve and be accountable for program outcomes/effectiveness;
4. Gaining insight about best practices and innovation;
5. Determining the impact of the program;
6. Empowering program participants and contributing to organizational growth.

1. Develop a Written Plan Linked to Curriculum
2. Collect Data
3. Analyze Data
4. Communicate and Improve

4 Basic Steps to Program Evaluation

Fitting the Pieces Together: Program Evaluation

(Diagram: Program Curriculum, Preceptor/Faculty/Staff, Trainee, Institution, and Overall Program fit together.)

Program Evaluation Feedback Loops

Trainee performance

Instructor and staff performance

Program curriculum performance

Programmatic and Institutional performance

Evaluation Process: How Do You Do It?

Steps in Evaluation:

Engage stakeholders

Describe the program

Focus the evaluation design

Gather credible evidence

Justify conclusions: analyze, synthesize and interpret findings; provide alternate explanations

Feedback, follow up and disseminate: ensure use and share lessons learned

Level 1: Reaction (satisfaction surveys). Was it worth the time; was it successful? What were the biggest strengths/weaknesses? Did they like the physical plant?

Level 2: Learning (observations/interviews). Observable/measurable behavior change before, during, and after the program.

Level 3: Behavior (observations/interviews). New or changed behavior on the job? Can they teach others? Are trainees aware of the change?

Level 4: Results (program goals/institutional goals). Improved employee retention? Increased productivity for new employees? Higher morale?

Kirkpatrick Model of Evaluation

• What will be evaluated?

• What criteria will be used to judge program performance?

• What standards of performance on the criteria must be reached for the program to be considered successful?

• What evidence will indicate performance on the criteria relative to the standards?

• What conclusions about program performance are justified based on the available evidence?

Questions Guiding the Evaluation Process

Basic Questions – Administrative Example

What? Postgraduate Training Program

Criteria? # of qualified applicants; # of trainees who remain with the program; ROI

Standards of Performance? # of applicants; half of trainees hired at the conclusion of the year; onboarding costs reduced; billable hours increase with ramp-up

Evidence? HR data / reports; financials

Conclusions? Is the investment worthwhile?

Accuracy, Utility, Feasibility, Propriety

Anchored in the goals and objectives of the curriculum

Formative and summative

Use measurable and observable criteria of acceptable performance

Multiple, expert ratings/raters: multiple observations give confidence in findings and provide an estimate of reliability (reproducibility or consistency in ratings).

Conclusions need to be relevant and meaningful. Validity is based on a synthesis of measurements that are commonly accepted, meaningful, and accurate (to the extent that expert judgments are accurate).

Goals of Good Evaluation

Credible evidence -- the raw material of a good evaluation. Believable, trustworthy, and relevant answers to evaluation questions.

Indicators (evidence): translate general concepts about the program and its expected effects into specific, measurable parts (e.g., increase in patient panel / billable hours over 1 year).

Sources: people, documents, or observations (e.g., trainees, faculty, patients, billable hours, reflective journals). Using multiple sources enhances the evaluation's credibility. Integrating qualitative and quantitative information makes the evaluation more complete and more useful for the needs and expectations of a wider range of stakeholders.

Quantity: determine how much evidence will be gathered in an evaluation. All evidence collected should have a clear, anticipated use.

Logistics: a written plan covering methods, timing (formative and summative), and the physical infrastructure to gather and handle evidence. It must be consistent with the cultural norms of the community and must ensure confidentiality is protected.

Learning Objectives

Knowledge:

– Understand the goals and purpose of evaluation

– Know the characteristics of good evaluation

– Understand the process of evaluation

– Understand the connection with curriculum

Attitude:

– Embrace the challenge

– Value the outcomes

Skills

– To be gained by independent / group work focused on local training program

The Community Tool Box (Work Group for Community Health at the U of Kansas): an incredibly complete and understandable resource that provides theoretical overviews, practical suggestions, a tool box, checklists, and an extensive bibliography.

Pell Institute: a user-friendly toolbox that steps through every point in the evaluation process: designing a plan, data collection and analysis, dissemination and communication, and program improvement.

The CDC has an evaluation workbook for obesity programs; its concepts and detailed work products can be readily adapted to NP postgraduate programs.

Another wonderful resource, Designing Your Program Evaluation Plans, provides a self-study approach to evaluation for nonprofit organizations and is easily adapted to training programs. There are checklists and suggested activities, as well as recommended readings.

NNPRFTC website – blogs: http://www.nppostgradtraining.com/Education-Knowledge/Blog/ArtMID/593/ArticleID/2026/Accreditation-Standard-3-Evaluation

Resources:

Action Items

Action Period Items: Monthly Reports

Draft Contracts/Agreements

Precepting Panel questions

Next Session: May 3