
Page 1: Designing Scientific Experiments

Designing Scientific Experiments

Dr. Gail P. Taylor

MBRS-RISE Coordinator

UT San Antonio

08/2006

Page 2: Designing Scientific Experiments

References

• Critical Thinking, the Scientific Method, and Page 25* of Gilbert. Dany S. Adams, Department of Biology, Smith College, Northampton, MA 01063. http://www.sdbonline.org/SDBEduca/dany_adams/critical_thinking.html#blurb

• Validity: http://carbon.cudenver.edu/~lsherry/rem/validity.html

• Kathy Barker, At the Bench: A Laboratory Navigator, updated edition. Cold Spring Harbor Laboratory Press, 2005

Page 3: Designing Scientific Experiments

Scientific Method

• Observe phenomenon & conceive ideas

• Make predictions/develop a hypothesis

• Devise a test/formulate experiment

• Carry out experiments

• Draw conclusions from results

• Reject or support hypothesis

Page 4: Designing Scientific Experiments

Types of Experiments

• Science does not generally deal with facts, but rather with evidence

• Each experiment weakens or strengthens a hypothesis

• Not all evidence is equal

• Try to discern cause and effect!

Page 5: Designing Scientific Experiments

Planning Experiments I

• What ideas have you come up with?

• Why is your idea important?

• Have other people tested this idea before?
  – http://www.pubmed.org

• What type of background information is available?

• Define question/Develop hypothesis (and null hypothesis)

Page 6: Designing Scientific Experiments

Determining Causality

• Causality can be difficult to prove
• Different experimental designs provide different strengths of evidence
• Correlative Evidence (weak evidence)
  – The two phenomena are found together in time or space
• Loss of Function (stronger evidence)
  – Blocking the candidate cause blocks the phenomenon
• Gain of Function (strongest evidence)
  – Initiating the first event leads to the second event (additional function)

Page 7: Designing Scientific Experiments

Example – Protein X may be involved in Cellular Aggregation

• “Show it” – correlative evidence (time and space):
  – Antibody used to detect the protein:
    • Found in a particular microorganism when it is aggregating (and not when free-living)
    • Found in areas where cells contact one another during aggregation
  – No causality; nothing beyond inference about function
    • Clumping could cause the protein expression
    • Clumping and protein expression could be induced by the same causative agent
    • Could be completely coincidental

Page 8: Designing Scientific Experiments

Example – Protein X may be Necessary for Cell Aggregation

• “Block it” – Loss of Function: what does its loss do to clumping?
  – Antibody to the protein used to block it from functioning, or the gene is knocked out
  – Clumping no longer takes place
• Need controls:
  – Clumping specifically, and only clumping, is being inhibited
  – Cells are not dying
  – May support that the protein functions as the “real” clumping agent
• Therefore the protein is necessary for clumping

Page 9: Designing Scientific Experiments

Example – Protein X may be Sufficient for Cellular Aggregation

• “Move it” – Gain of Function
  – In an organism that does not normally clump…
    • Artificially introduce the required protein
    • Or artificially turn it on at all times (constitutively express)
  – Aggregation now takes place
• Therefore the protein is sufficient to induce clumping

Page 10: Designing Scientific Experiments

Progression to Necessary and Sufficient

• Often you will see this progression through biological research papers
  – What is it?
  – Yes, it’s there
  – Yes, it’s in the right place
  – Its loss produces this response
  – Its addition produces this response…

Page 11: Designing Scientific Experiments

Planning Experiments II

• Consider statistical methodologies during planning stages

• Look in prior papers for ideas about statistics
• Statistical analysis will generally discern the likelihood that a result occurred by chance
• Consult your mentor or a statistician for confirmation
  – Compare one treatment group and a control: t-test
    • Decide on p (probability value): p < 0.05 or 0.01
  – Compare many treatment groups: ANOVA
  – Many more… (a short worked sketch follows below)
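As a rough illustration of the two tests named above, here is a minimal sketch using Python's scipy.stats; the measurement values are hypothetical placeholders, not data from any real experiment.

```python
# Minimal sketch: comparing hypothetical measurements with a t-test and ANOVA.
# All numbers below are invented for illustration only.
from scipy import stats

# One treatment group vs. one control group: two-sample t-test
control   = [12.1, 11.8, 12.5, 12.0, 11.6]
treatment = [14.2, 13.9, 14.8, 14.1, 13.5]
t_stat, p_value = stats.ttest_ind(treatment, control)
print(f"t-test: t = {t_stat:.2f}, p = {p_value:.4f}")  # compare p to your chosen cutoff (0.05 or 0.01)

# Several treatment groups: one-way ANOVA
dose_low  = [12.4, 12.9, 12.2, 12.7]
dose_mid  = [13.6, 13.1, 13.9, 13.4]
dose_high = [14.8, 15.2, 14.5, 15.0]
f_stat, p_anova = stats.f_oneway(dose_low, dose_mid, dose_high)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_anova:.4f}")
```

Note that a significant ANOVA only says that at least one group differs; follow-up (post hoc) comparisons are needed to say which one, which is one more reason to consult a statistician before collecting data.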

Page 12: Designing Scientific Experiments

Planning Experiments III

• Variables
  – Independent (manipulated)
  – Dependent (outcome)

• # of samples (minimum 2, 3 better)

• # repetitions (minimum 2x)

Page 13: Designing Scientific Experiments

Internal Validity

• Cause and effect: did the experimental treatment, and only the experimental treatment, cause the effect?
  – Controls (be careful!)
  – Prevent additional variables from sneaking into your experiment
  – Must control for:
    • Selection: anything that makes treatment and control groups different at the beginning (use random assignment; a small sketch follows after this list)
    • History: different things that may happen to the experimental and control groups between the initial treatment and the measurement
    • Maturation: natural changes in the subjects (aging)
    • Instrumentation: all tests/equipment/reagents must stay the same throughout the experiment
    • Testing: the “incoming” test may “teach” the subject
    • Mortality: subjects may leave or die (contamination)
    • Regression: if initial test scores were high, on average they will naturally move toward the mean
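To make the random-assignment point concrete, here is a minimal sketch (subject IDs and group sizes are hypothetical) of assigning subjects to treatment and control groups at random, so the groups do not differ systematically at the start.

```python
# Minimal sketch of random assignment to treatment vs. control.
# Subject IDs are hypothetical placeholders.
import random

subjects = [f"subject_{i:02d}" for i in range(1, 21)]  # 20 hypothetical subjects

random.seed(42)            # record the seed in your lab notebook so the assignment is reproducible
random.shuffle(subjects)   # shuffle so assignment is independent of any subject characteristic

half = len(subjects) // 2
treatment_group = subjects[:half]
control_group   = subjects[half:]

print("Treatment:", sorted(treatment_group))
print("Control:  ", sorted(control_group))
```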

Page 14: Designing Scientific Experiments

External Validity

• The extent to which the findings of the study can be applied, reproduced, or generalized to other settings or systems, i.e., techniques to ensure that the study groups correspond to the general population

• Unrepresentative Sample: Sample members not representative of general population.

• Clear Description of the Treatment or Protocol (replicability)

• Hawthorne Effect: Subjects know that they are being studied and it influences behavior

• Novelty Effect: particularly in humans… subjects enjoy the experiment at first, then possibly don’t.

• Pretest Sensitization: If the pretest is part of the treatment, it will obviously affect the results or findings.

• History and Treatment Interaction: something else happened that influenced the results for all participants

• Measurement of the Dependent Variable: Treatment and data collection must be the same every time!

Page 15: Designing Scientific Experiments

Types of Controls

• Experimental controls
  – Standards/calibration
  – Animal/cell selection and care
  – Positive controls
    • See what a positive response looks like and that it can be obtained (e.g., positively expressing cells…)
  – Negative controls
    • Show what a zero response looks like
• Treatment controls
  – All groups treated identically except for the independent variable
  – If two treatments are combined, show each individually
  – All time points must be covered
  – Multiple samples

Page 16: Designing Scientific Experiments

Keeping it Simple

• Your mentor wants to look at the time-course effects of a possible cancer suppressor on proliferation and mRNA expression in six breast cancer cell lines, at 0, 12, 24, 36, 48, 72, and 96 h.

Page 17: Designing Scientific Experiments

The beauty of Small experiments….

• Mega experiment
  – Ex: 6 types of cells, 7 time points, treated and untreated (2), in triplicate (3)
  – 6 × 7 × 2 × 3 = 252 plates
• Plan strategically and break it down… (a quick check of the arithmetic follows below)
  – 1 cell line, treated and untreated, in duplicate, 7 time points = 28 plates. Or postpone the duplicates.
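A quick check of the arithmetic above (the counts come straight from the example; nothing else is assumed):

```python
# Full factorial ("mega") experiment
cell_lines, time_points, conditions, replicates = 6, 7, 2, 3
print(cell_lines * time_points * conditions * replicates)  # 252 plates in one run

# Scaled-down pilot: one cell line, treated vs. untreated, in duplicate
print(1 * time_points * 2 * 2)                             # 28 plates
```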

Page 18: Designing Scientific Experiments

Results from Small Experiments

• Low possibility for confusion
• Reasonable workload
• Reasonable use of resources
• Ability to assess as you progress
  – Easy to interpret
  – Can change directions on the fly

• Easy to create discrete graphs

Page 19: Designing Scientific Experiments

Chasing the Big Problem

• For a publication…
  – Need a big picture of what you are pursuing; tell a good story
  – Start with correlation
  – Get additional information
  – Knock it out / add it back / overexpress

• Slight modifications, depending on field

Page 20: Designing Scientific Experiments

How to do Experiment – Obtain Protocol

• Instructions for carrying out a particular technique
• If followed, will produce the desired results
• Best if it’s a proven protocol
  – Designing your own is time-consuming
  – Obtain it from another investigator
    • From within your own lab is best
  – From a book of protocols, from the web, or from a kit
    • Will need fine-tuning for your local circumstances
  – From the methods section of published papers (least reliable)

Page 21: Designing Scientific Experiments

Review Protocol

• Read it and do a dry run-through

• May find logic gaps

• May find references to “common” procedures you do not know

Page 22: Designing Scientific Experiments

Personalize Protocol

• Rewrite it (keeping the same steps, etc.) so it makes more sense to you.

• Add notes about your own equipment as required.

Page 23: Designing Scientific Experiments

Fully Prepare before Experiment

• Buy all required materials
  – e.g., radioisotopes

• Make all solutions and buffers

• Reserve machine time if needed

Page 24: Designing Scientific Experiments

Follow Protocol exactly, first time through

• If it doesn’t work, you can assume it’s you.

• Do it again. Still not working?

• You can get help from the person who provided the protocol.

Page 25: Designing Scientific Experiments

Modify Protocol

• Once protocol is working, modify.

• Make notations of changes

• Rewrite it for the next run-through

• It is good to type the protocol into a computer… you can record changes and re-print it.

Page 26: Designing Scientific Experiments

During Experiment

• Record which media and which temperature- or date-sensitive agents you used

• Record any procedural deviation
  – Dropped something
  – Delays
  – Calibration questions

• Put in lab notebook in a timely fashion

Page 27: Designing Scientific Experiments

Interpreting Results I

• Did the experiment work?
  – Examine procedural controls (markers present, cells lived)
  – Examine the positive control (yes, the antibody is working)
  – Examine the negative control (no, not everything came up positive)

Page 28: Designing Scientific Experiments

Interpreting Results II

• What were the results?
  – Compared to controls, did you see an effect?
  – Graph your data (a minimal plotting sketch follows below)
  – How big was the effect?
  – Did the effect vary over time?
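As a minimal sketch for the “graph your data” step: the time points match the earlier example, but the measurement values below are hypothetical placeholders.

```python
# Minimal sketch: treatment vs. control over the time course (hypothetical values).
import matplotlib.pyplot as plt

hours     = [0, 12, 24, 36, 48, 72, 96]
control   = [1.0, 1.9, 3.6, 6.8, 12.5, 24.0, 45.0]  # e.g., relative cell number, untreated
treatment = [1.0, 1.7, 2.8, 4.1, 5.9, 8.2, 10.5]    # e.g., relative cell number, treated

plt.plot(hours, control, "o-", label="Untreated control")
plt.plot(hours, treatment, "s--", label="Treated")
plt.xlabel("Time (h)")
plt.ylabel("Relative proliferation")
plt.legend()
plt.title("Effect size and its change over time")
plt.show()
```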

Page 29: Designing Scientific Experiments

Interpreting Results III

• What does the experiment mean?
  – Do the results make sense?
  – Was the result what you expected?
  – Can you explain spurious results?
  – What additional controls may you need?

Page 30: Designing Scientific Experiments

Interpreting Results IV

• What do other investigators think?
  – Talk to lab members
  – Discuss the results with someone versed in the technique
  – Run through the background papers again
  – Repeat the results

Page 31: Designing Scientific Experiments

Interpreting Results V

• Are the results repeatable?
  – Do the experiment again
  – Add any additional controls

Page 32: Designing Scientific Experiments

Agh! It didn’t work!

1. Check your notes.
2. Redo the experiment.
3. Focus on individual parts of the experiment.
   1. Positive and negative controls…
4. Do a partial experiment to ensure it’s fixed.
5. When you’ve done all of that… try again several times.
6. If it is an external protocol, you may want to switch.

Page 33: Designing Scientific Experiments

Switching Projects

• Never can reproduce data

• Project has little support from PI

• Direction of project has changed

• Not technically possible to do experiments well

• Project too difficult or involved