
Writing Proposals with Strong Methodology and Implementation
Kusum Singh, Virginia Tech
Gavin W. Fulmer, National Science Foundation


Goals

Encourage you to seek funding from NSF for your research.

Help you develop rigorous methodology, data collection and analysis plans that will make your proposal competitive.

Help you consider the level of detail appropriate for implementation projects.


Describing Your Project’s Methodology


Expectations for Methods in DRL

The DRL Programs welcome research using a variety of evidence.

The program is open to qualitative, quantitative, and mixed methods.

Methods must be rigorous and appropriate to the proposed research questions or hypotheses.

Design, methods, and analytic techniques should have a coherent and logical link.

Research methods should be described in adequate detail.

Details of Methods to Include – 1

Provide a rationale for your research design

Make it clear how the research design and analyses answer the research questions (RQs)

Include a description of study population and sampling method, sample size, expected effect size

Power analysis should inform sample size decision
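For illustration only (not part of the original slides), a simple power analysis might look like the sketch below, using Python's statsmodels with assumed values for the effect size, significance level, and desired power:

```python
# Minimal power-analysis sketch; the effect size, alpha, and power values
# are assumptions for illustration, not recommendations from the slides.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(
    effect_size=0.5,        # assumed Cohen's d for the expected effect
    alpha=0.05,             # significance level
    power=0.80,             # desired statistical power
    alternative="two-sided",
)
print(f"Required participants per group: {n_per_group:.0f}")  # roughly 64
```

Reporting the assumed effect size alongside the resulting sample size lets reviewers judge whether the design can plausibly detect the effect of interest.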


Details of Methods to Include – 2

Instruments or protocols to be used

Validity, reliability, and triangulation of measures (a reliability check on pilot data is sketched after this list)

Reviewers are cautious about development of new measures.

Data analysis plans: statistical models and procedures for analysis of text/video/observation data

All of these need a rationale that connects to your RQs.
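As a hedged illustration of the reliability point above (not from the slides), internal-consistency reliability of a multi-item scale can be checked on pilot data before committing to the measure. The sketch below computes Cronbach's alpha in Python, with simulated responses standing in for real pilot data:

```python
# Hypothetical sketch: Cronbach's alpha for a multi-item scale.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array, rows = respondents, columns = scale items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(1)
# Simulated pilot data: 40 respondents, 5 items driven by one underlying trait.
trait = rng.normal(size=(40, 1))
responses = trait + rng.normal(scale=0.8, size=(40, 5))
print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")
```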


Quantitative research

Research design (e.g. experimental, quasi-experimental and non-experimental designs, issues of internal & external validity)

Measurement (e.g. data to be collected, constructs, measures, validity & reliability of measures)

Data analysis (e.g. statistical decisions, models & procedures)
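For example (a hypothetical sketch, not taken from the slides), a data analysis plan is more concrete when it names the model and the nesting structure explicitly. The snippet below specifies a two-level mixed model for students nested in schools using statsmodels, with simulated data standing in for the study data:

```python
# Hypothetical sketch: a two-level mixed model (students nested in schools),
# specified concretely rather than described only as "HLM".
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_schools, n_students = 20, 30
df = pd.DataFrame({
    "school": np.repeat(np.arange(n_schools), n_students),
    "treatment": np.repeat(rng.integers(0, 2, n_schools), n_students),
    "pretest": rng.normal(50, 10, n_schools * n_students),
})
# Simulated outcome: treatment effect + pretest effect + school-level noise.
df["posttest"] = (df["pretest"] + 2.0 * df["treatment"]
                  + np.repeat(rng.normal(0, 3, n_schools), n_students)
                  + rng.normal(0, 5, len(df)))

# Random intercept for school; fixed effects for treatment and pretest.
model = smf.mixedlm("posttest ~ treatment + pretest", data=df, groups=df["school"])
result = model.fit()
print(result.summary())
```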


Qualitative Research

Identify the methodology as a systematic research design (e.g. case study, discourse analysis, etc.)

Describe how and what data will be collected.

Consider issues of validity and triangulation.

Include plans for analysis of textual data (coding scheme, themes, etc.); a sketch of a documented coding scheme follows this list.

Find a good balance between a planned approach to analysis and the flexibility to respond to findings.
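As a hypothetical illustration of what a documented coding scheme might look like (the codes, definitions, and excerpt below are invented, not from the slides), the analysis plan can spell out the initial codes and how coded excerpts will be recorded:

```python
# Hypothetical starting coding scheme for interview/observation transcripts;
# codes would be refined iteratively as analysis proceeds.
coding_scheme = {
    "conceptual_understanding": "Participant explains the idea in their own words",
    "procedural_fluency": "Participant applies a procedure correctly",
    "engagement": "Participant initiates or sustains task-related talk",
}

# Example record of one coded excerpt (identifiers and text are invented).
coded_excerpt = {
    "source": "teacher_interview_03",
    "excerpt": "I think the graph rises because the rate keeps changing...",
    "codes": ["conceptual_understanding"],
}

for code in coded_excerpt["codes"]:
    print(f"{code}: {coding_scheme[code]}")
```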


Find the Expertise You Need

Content experts are not necessarily methods experts, so partner with research methodologists.

Sooner is better than later (i.e., at the proposal-writing stage).

This is especially necessary if the design is complex or you use innovative methods.

Find a colleague, as co-PI or as consultant.


Common Missteps in Methods – 1

Overly generic language and description: “We will use constant comparative methods.” “We will use HLM.”

Lack of a consistent link between the theory, the RQs, the data collected, and the analyses. Reviewers will notice.

Methods and planned analyses inadequate to answer the RQs. Try developing a matrix of RQs, data/measures, and analyses – even if only for your own use during planning.
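Such a matrix can be as simple as a small table kept alongside the proposal draft. The sketch below (with invented example rows, not content from the slides) builds one with pandas:

```python
# Illustrative planning matrix linking each research question to its data
# sources and planned analyses; the rows are hypothetical examples.
import pandas as pd

rq_matrix = pd.DataFrame({
    "research_question": [
        "RQ1: Does the intervention improve student achievement?",
        "RQ2: How do teachers enact the curriculum in class?",
    ],
    "data_measures": [
        "Pre/post achievement test scores",
        "Classroom observations; teacher interviews",
    ],
    "analysis": [
        "Two-level mixed model (students nested in schools)",
        "Thematic coding of transcripts and field notes",
    ],
})
print(rq_matrix.to_string(index=False))
```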


Common Missteps in Methods – 2

Too little or too much data without a clear analysis plan. Reviewers will wonder if you understand the task.

A method that is novel and not well understood in the field needs more detail, examples, and citations to justify that it is appropriate.


Summary of Main Points

Articulate clearly your research questions or research hypotheses.

Think about the most appropriate and rigorous methods to answer your research questions.

Give a clear and concise description of the research methods.

Include your rationale for research design decisions.

Include a research methods expert on your team.

Articulate clearly why your research is important and how it would contribute to theory and practice.


Describing an Implementation


Details of Implementation

There are important implementation issues that need to be addressed if your project includes:

Curriculum development

Professional development

Interventions


For All Implementation Projects


Consider the method(s) used to gauge the quality of the implementation, whether as “Fidelity of Implementation” (FOI), Intended/Enacted Curriculum, or other approaches.

Be specific on the STEM content, ages/grades, and settings.

Be clear on the roles of the team: Who will lead the PD or curriculum? Who will oversee implementation? Who will collect evaluative data on implementation?

Issues for Curriculum Development

Specify the STEM content of interest and the age range(s) for which you are developing curriculum.

Specify the role(s) of the PI team, outside experts, participating teachers, or others.

Identify the process for development, revision, and field-testing.

Provide justification for the design process you will use.

Make sure the measures match the materials/curriculum under development.


Issues for Professional Development

Be specific on the professional development (PD): STEM content, grades, and school settings.

Role(s) of the PI team, outside experts, participating teachers, or others.

Format of the professional development (e.g., online, workshops), duration and location of the PD, and evaluation.

Identify the model for PD you will use: train-the-trainer, master teacher, or professional learning community.

Provide justification for the model, the format, and your team’s expertise.


Issues for Intervention

Describe the intervention’s development history and prior use.

Provide evidence, if any, for the intervention’s potential effects.

Describe in detail: the population and sample; the setting, duration, and content; and the design process, if the intervention will be revised iteratively.


Consider Generalizability

If you are developing a new curriculum/PD model: How will the intervention, curriculum, or professional development developed in your setting apply to new settings that may differ from the study?

If you are applying an intervention, PD model, or curriculum adopted from another setting: How well does that intervention apply to your setting? Will promising prior results be replicable in this project?

Evaluation Plan

Evaluation should be useful for improving the research project.

The design and content of the plan should be appropriate to what would enhance or benefit the project.

Formative or summative, internal or external evaluation may be appropriate, depending on the project.

For example, advisory committees are appropriate for the evaluation of projects.

Go to the specific session on Project and Program Evaluation later in the conference for more details.

Don’t be shy.

Any Questions?

Thank you! Feel free to contact Kusum Singh for follow-up and tips for finding a good methodologist:

[email protected]
