JOURNAL OF RESEARCH IN SCIENCE TEACHING VOL. 48, NO. 1, PP. 94–116 (2011)
Engaging Students in Environmental Research Projects: Perceptions of Fluency With Innovative Technologies and Levels of Scientific Inquiry Abilities
Jazlin Ebenezer,1 Osman Nafiz Kaya,2 Devairakkam Luke Ebenezer3
1 299 College of Education, Wayne State University, Detroit, Michigan 48202
2 Faculty of Education, Department of Science Education, Firat University, Elazig, Turkey
3 College of Business, Northern Caribbean University, Mandeville, Jamaica, West Indies
Received 30 December 2009; Accepted 12 March 2010
Abstract: The purpose of this mixed-method study was to investigate the changes in high school students’
perceptions of fluency with innovative technologies (IT) and the levels of students’ scientific inquiry abilities as a result of
engaging students in long-term scientific research projects focusing on community-based environmental issues. Over a
span of 3 years, a total of 125 ninth- through twelfth-grade students participated in this study. A project-specific Likert-
scale survey consisting of three parts (fluency with All Technologies, GPS/GIS, and CBL2/EasyData) was administered
to all students as a pre- and post-test. At the end of the study, 45 students were randomly interviewed and asked to
elaborate on the changes in their perceptions of fluency with IT. The results indicated statistically significant increases
(p < 0.001) in students’ perceptions of their fluency with IT. Qualitative analysis of students’ interview results
corroborated the statistical findings of students’ changes in perceptions of their fluency with IT. Students’ research papers
based on the environmental studies conducted at the interface of classroom and community were analyzed using the
Scientific Inquiry Rubrics, which consist of 11 criteria developed by the researchers. Results indicated the students’
abilities to conduct scientific inquiry for 7 out of 11 criteria were at the proficient level. This study clearly points to the
correlation between the development of IT fluency and ability levels to engage in scientific inquiry based on respective
competencies. Ultimately, this research study recommends that students’ IT fluency ought to be developed and assessed
concurrently with an emphasis on contemporary higher order scientific inquiry abilities. © 2010 Wiley Periodicals, Inc.
J Res Sci Teach 48: 94–116, 2011
Keywords: biology/life science; science teacher education; technology education; secondary
Most educational experts in the U.S.A., in principle, believe that ‘‘scientific inquiry is at the heart of
science and science learning’’ (National Research Council, 1996, p. 15). Science as inquiry is, in fact,
an international aspiration and focus (Council of Ministers of Education of Canada, 1997; Department for
Education and Employment, 1999). In a quest to make scientific inquiry the soul of science education
practice, various technologies have been used in science curricula. Recent forms of technology
incorporation into science curricula have been based on design principles. The built-in tools in virtual science
curricula are for ‘‘learning to learn’’ the scientific practices. For example, Model-It is a meta-cognitive e-tool
that enables students to represent and test their ideas through dynamic model building of science phenomena
and running simulations with their models to verify and analyze the results (Jackson, Krajcik, & Soloway,
2000).
To facilitate students in constructing and evaluating scientific explanations, Sandoval and Reiser (2004)
designed a technology-supported inquiry curriculum for the study of the natural phenomena of evolution and
natural selection. These authors concluded that an epistemic affordance especially designed for a particular
purpose can support that process in students’ scientific inquiry. Friedrichsen, Munford, and Zembal-Saul
(2003) used inquiry-empowering technologies (i.e., computer-based tools specially designed to support
Correspondence to: Jazlin Ebenezer; E-mail: [email protected]
DOI 10.1002/tea.20387
Published online 3 May 2010 in Wiley Online Library (wileyonlinelibrary.com).
scientific inquiry) to document individual prospective teachers’ understanding of science as argumentation.
These authors concluded that technological tools have the potential to challenge or reinforce prospective
science teachers’ perceptions of what it means to learn science, what science is, and what characterizes school
science. Clearly, the foregoing design-based studies have added value to developing student understanding of
certain epistemic aspects of scientific inquiry.
Instead of the design-based virtual means for developing students’ understanding of scientific inquiry,
we launched the Translating Innovative Technologies into Classroom (TITiC) project that enabled students to
work with innovative technologies (IT) in authentic research contexts. Our assumption was that the use of IT
in environmental research projects in real-world context would develop not only fluency with IT but also
scientific inquiry abilities. Thus, in the second phase of the TITiC project, participating teachers engaged
students in environmental research projects with IT to improve students’ use of technologies in scientific
inquiry.
This study, which took place in schools as part of the TITiC project, explores whether immersing students
in authentic environmental research projects using IT develops fluency in using those specific innovative
technologies. Based on the IT we expected students to use in their research projects, we review the science
education literature pertaining to the goals of using IT in science research projects. Then, based on the
literature review, we develop a theoretical framework that reflects the use of IT and the standards of scientific
inquiry. Founded on these standards of scientific inquiry, rubrics are presented that assess students’ scientific
inquiry abilities as manifested in their research papers. The major focus of this study is to observe students’
changes in their perceptions of their IT fluency when these technologies were used in environmental projects.
Determining students’ level of attainment in scientific inquiry became the secondary purpose.
Scientific Inquiry with Innovative Technologies Studies
Enabling learners to use technology as a tool in conducting scientific inquiry is a National Science
Education Standard (NRC, 1996). There are two standards pertinent to the use of technologies in scientific
inquiry. Using a variety of technologies for investigation refers to the necessary tools (e.g., hand tools;
measuring instruments and calculators; electronic devices; and computers for the collection, analysis, and
display of data). The use of mathematical tools and statistical software refers to applying these to collect,
analyze, and display data in charts and graphs and to conduct statistical analyses. Closely aligned with these
scientific inquiry standards is one of the technology performance indicators—‘‘research and information
fluency’’ advocated by the National Education Technology Standards for Students (NETSS) (ISTE, 2008).
Each of the science and technology standards may be accomplished by various technologies. Critical to the
focus of the issue in this article is, however, the use of high-end technologies in scientific inquiry, which
includes calculator-based laboratory learning and the Global Positioning System (GPS) and the Geographic
Information System (GIS).
Calculator-based laboratories are hand-held computers connected to an interface box and a probe (e.g.,
temperature, pressure, pH, etc.) (Marcum-Dietrich & Ford, 2002; Schultz, 2003). In the calculator-based
laboratory experience, students gather real-time data in less time and spend more time analyzing graphical
representations of the data, questioning results, restructuring ideas, questioning the implications of their own
findings, and exploring new questions to investigate, ostensibly based on a newly introduced variable into the
experiment (Lapp & Cyrus, 2000; Schultz, 2003). The calculator-based laboratories overcome the barrier of
brief class periods for laboratory work because students can conduct their experiments in the time period
these call for. Because graphs are generated as students collect data, experimental design can be revised.
Students in different geographic locales have the opportunity to ask common questions and use common
research methods, as well as share their data and interpretations.
Studies on computer-based laboratories (e.g., Adams & Shrum, 1990; Dori & Sasson, 2008) and, more
recently, hand-held computers connected to probes or calculator-based laboratory learning (Griffin & Carter,
2008; Kwon, 2002) have focused primarily on the effect on student graphing abilities. For example, Dori and
Sasson (2008) developed a computerized laboratory with a focus on scientific inquiry and comprehension.
These authors investigated the chemical understanding and graphing skills of 857 Israeli 12th-grade chemistry
honors students in the computerized learning environment over 3 years. Assessment of students’ graphing and
chemical understanding-retention skills indicated significant improvement. More pertinent to the study at
hand is the work of Griffin and Carter (2008). These two researchers demonstrated that students were able to
use the portable data collection devices and tools to conduct scientific inquiry and engage in scientific
discourse related to the concepts of temperature and heat.
Complementing the latest probe technologies are the Global Positioning System (GPS) and the
Geographic Information System (GIS). The GPS is a U.S. space-based radio navigation system that provides
accurate location and time information for unlimited numbers of people in all weather, day and night,
anywhere in the world (National Space-Based Positioning Navigation and Timing Coordination Office,
2009). The GIS is ‘‘a computer-based system for managing, storing, analyzing, modeling, and visualizing
spatial information’’ (Zerger, Bishop, Escobar, & Hunter, 2002, p. 67). GIS technology is especially suitable
for studying the environment of a local community, from data collection to computerized mapping
(Bednarz, 2004). Hess and Cheshire (2002) administered a pre- and post-
sequence survey to evaluate the effectiveness of their ‘‘problem-based learning approach’’ in studying the
forest basal areas using spatial information technologies and to determine how undergraduate students
perceived their learning. Students acknowledged that the integration of the GIS, GPS, and sampling
techniques helped them with problems associated with measurements of natural resources and deepened their
understanding of how these techniques are applied. In Ramos, Miller, and Korfmacher’s (2003) study,
undergraduate students performed a common analysis of lead in sediment using atomic absorption
spectroscopy (AAS), and this research was incorporated into a GIS-based environmental assessment of
sediment deposition rates in a local pond. Student evaluations of the course at the end of the semester clearly
indicated that they preferred the problem-oriented approach to learning about heavy metal analysis of
sedimentation to the more traditional modes of instruction. The visual images generated through the GIS
analysis naturally led to a discussion of the watershed and extended the chemical analyses to land
management issues.
Calculator-based laboratories and Global Positioning System/Geographic Information System (GPS/
GIS) technologies are gradually becoming integral parts of the learning process and the ways learners
generate (construct), manage (represent), and communicate (validate) knowledge in science classrooms
(Bransford, Brown, & Cocking, 2000). With these innovative technologies, students can be creative problem
explorers and problem solvers. Together, ‘‘this combination of real-world investigations and interactive
visualizations helps students grasp the interrelationships of natural and human elements in their environments
and develop key concepts and inquiry skills’’ (Sanders, Kajs, & Crawford, 2001, p. 125).
Problem Statement
While the design and other technology research projects have focused on various ways IT can be used in
science learning, no study has examined students’ use of technologies in real-world research projects
designed to develop fluency with technologies and scientific inquiry. In addition, research has yet to be
published on whether students’ IT fluency pays dividends in terms of improving their scientific inquiry
abilities. And finally, no study as of yet has shown the development of scientific inquiry abilities in the context
of conducting authentic research with IT.
Although our present study highlights the importance of fluency acquisition related to IT, the primary
focus is not on acquiring technology skills per se but on enabling students to achieve relevant science learning
goals by increasing their IT fluency and their technological skills. Our assumption is that IT fluency acquired
through education in authentic learning contexts is an important part of enabling students to confidently use
technologies in research projects. Fluency with IT in our present study can be defined as demonstrating the
necessary confidence to appropriately use technology applications in empirical science or to explore real-life
problems. Developing IT fluency is a lifelong endeavor, and our work focuses on the beginning aspects of
high school learners’ IT fluency and how it might influence developing scientific inquiry abilities. This study
on IT fluency and its influence on scientific inquiry abilities is, therefore, guided by two major research
questions:
(1) Is there a significant difference in students’ perceptions of fluency with innovative technologies
after their experience with long-term environmental research projects? Do students’ interpretations
of their experience support the changes in perceptions?
(2) How do students’ research papers point to their levels of scientific inquiry abilities as a result of
using innovative technologies in environmental research projects?
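The pre/post comparison implied by the first research question is, in its simplest form, a paired-samples t test on each student's pre- and post-survey scores. The sketch below illustrates that computation with invented Likert ratings; the actual survey data, sample sizes, and the specific test the authors ran are not reproduced here.

```python
# Illustrative only: a paired-samples t statistic for pre/post Likert scores.
# The ratings below are invented, not the study's data.
from math import sqrt
from statistics import mean

def paired_t(pre, post):
    """Return the paired-samples t statistic for two matched score lists."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    d_bar = mean(diffs)
    sd = sqrt(sum((d - d_bar) ** 2 for d in diffs) / (n - 1))  # sample SD of differences
    return d_bar / (sd / sqrt(n))  # t with n - 1 degrees of freedom

pre = [2, 3, 2, 1, 3, 2, 2, 3]    # hypothetical pre-test fluency ratings
post = [4, 4, 3, 3, 4, 4, 3, 4]   # hypothetical post-test ratings, same students
t = paired_t(pre, post)
```

A positive t that is large relative to the t distribution with n - 1 degrees of freedom corresponds to the kind of significant pre-to-post gain (p < 0.001) the abstract reports.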
This study is significant for several reasons. First, observing students’ changes in perceptions (pre- to
post-intervention) of their collective experience suggests that our work with teachers and, in turn, their work
with students, can provide a professional development model for those who intend to educate and train
students in innovative technologies. Secondly, critically analyzing students’ research papers using a set of
rubrics based on the standards of scientific inquiry illuminates their scientific inquiry abilities. Readily
discernible are the scientific inquiry criteria by which students’ skills can be classified as proficient,
developing, beginning, or missing. This approach to determining students’ scientific inquiry abilities may be
more valid than surveying students (e.g., Lederman, 2004) about their scientific inquiry abilities. And finally,
this study reveals that immersing high school students in scientific inquiry about environmental issues in
order to develop IT fluency enables them to acquire scientific inquiry abilities. Hence, the context of IT
development might be a justifiable learning environment in which to develop scientific inquiry abilities.
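The rubric-based classification described above lends itself to a simple tally: each of the 11 criteria is scored at one of four levels, and the criteria are then counted per level. The sketch below illustrates that tally; the criterion names are invented placeholders, not the authors' actual Scientific Inquiry Rubrics criteria.

```python
# Illustrative tally of rubric levels across 11 criteria for one research paper.
# Criterion names below are hypothetical placeholders.
from collections import Counter

LEVELS = ["missing", "beginning", "developing", "proficient"]

def summarize(scores):
    """Count how many rubric criteria fall at each performance level."""
    counts = Counter(scores.values())
    return {level: counts.get(level, 0) for level in LEVELS}

paper_scores = {
    "research question": "proficient",
    "hypothesis": "proficient",
    "design": "developing",
    "measurement": "proficient",
    "data display": "proficient",
    "statistics": "developing",
    "evidence-explanation": "proficient",
    "models": "beginning",
    "alternative explanations": "developing",
    "communication": "proficient",
    "argument": "proficient",
}
summary = summarize(paper_scores)
```

A profile like this one, with 7 of 11 criteria at the proficient level, mirrors the overall result the abstract reports for the student papers.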
Theoretical Framework: Technology-Embedded Scientific Inquiry
‘‘Science as inquiry’’ (NRC, 1996) is a concept that is not necessarily equivalent to the science processes
promoted in the 1960s and 1970s. Scientific inquiry combines ‘‘the use of processes of science and scientific
knowledge as they (students) use scientific reasoning and critical thinking’’ (p. 105). These processes of
science include contemporary higher order scientific inquiry abilities such as using scientific ideas to shape
research, applying mathematical tools and statistical software, making evidence-explanation connections,
and communicating claims and arguments. These science inquiry abilities promote learning as conceptual,
social, and technological and involve the following three hallmarks of scientific inquiry: scientific
conceptualization, scientific investigation, and scientific communication (see Figure 1). Our assumption is
that conducting technology-embedded scientific inquiry within in-classroom and out-of-classroom settings
will have a significant impact on the development of students’ innovative technologies fluency and their
scientific inquiry abilities (NRC, 1996).
Scientific Conceptualization
Instruction in scientific conceptualization requires teachers to provide opportunities for students to test
and clarify conceptual ideas in ways that lead to a deeper understanding of subject matter, thus shaping the
way learners engage in the problem of inquiry. For example, knowledge scaffolding and integration
Figure 1. The technology-embedded scientific inquiry (TESI) model.
framework includes the processes of eliciting ideas, making ideas visible, adding ideas, developing criteria,
and sorting out ideas (Davis & Linn, 2000). Templates can be created to provide scaffolding for students to
help them conceptualize their ideas and revise their work when completing complex activities (Kolodner,
Owensby, & Guzdial, 2004). Scientific conceptualization also involves engaging students in ‘‘investigative
empirical science with technology,’’ allowing students to conduct computer-simulated investigations,
modeling, and visualization (Bell & Trundle, 2008; Ebenezer, 2001). Dori and Sasson (2008) have
demonstrated that student creation of chemical compounds using molecular modeling software has enabled
them to compare their representations to those of their peers on the pathway to understanding physical and
chemical properties.
There are also additional ways of engaging in scientific conceptualization through the use of
technologies. For example, online concept maps have been useful for enhancing collaborative learning
because students are able to discuss and negotiate the relational links among concepts of the knowledge
network (Canas et al., 2001). Technology-enhanced Learning in Science (TELS) features a module that
focuses on the science concept of velocity and embeds scientific terms in the context of an interview (Tate,
2005). In a conversational style, a student exposes her understanding of science content and her knowledge of
science terms. Virtually, the student has to reveal that she knows how to use data to compute the velocity of her
trip when she travels from a particular park to a movie theater to meet her friends. Discernible are many
creative ways of aiding students in the conceptualization of scientific knowledge through the use of IT.
While simulations such as molecular motion and the process of hydration are used to help students learn
concepts that are abstract or invisible, students must be reminded that simulations are only models
representing the phenomena being studied. They must be encouraged to reflect personally and relationally on
the nature of scientific models and their applications in the construction of scientific knowledge (Flick & Bell,
2000). In short, while virtual learning has a role to play in science education, it does not fully engage learners
in the most beneficial form of scientific conceptualization unless they put forth effort in learning the scientific
model through personal reflection and collaboration. For example, Chang, Quintana, and Krajcik (2010) have
argued that allowing learners to design animations of the particulate nature of matter with simple tools and
engaging in peer evaluation was more effective at improving student learning than designing animations with
no peer evaluation. In other words, scientific conceptualization via virtual models should accompany
collaborative thinking.
Scientific Investigation
Instruction in scientific investigation involves scaffolding students in the following critical abilities: (a)
formulating researchable questions or testable hypotheses; (b) demonstrating logical connections between
scientific concepts guiding a hypothesis and the design of an experiment; (c) designing and conducting
scientific investigations; (d) using measurement instruments; (e) using mathematical tools and statistical
software to collect, analyze, and display data in charts and graphs; (f) recognizing how investigation itself
requires clarification of research questions, methods, comparisons, and explanations; and (g) weighing
evidence using scientific criteria to find explanations and models (NRC, 1996). The Global Learning and
Observations to Benefit the Environment (GLOBE) program provides a good example of ways that these
abilities can be fostered in a technology-embedded environment (see http://www.globe.gov/). In this
program, students are immersed in conducting authentic science investigation. Students take scientifically
valid measurements, analyze data, and report data through the Internet; publish their research projects based
on GLOBE data and protocols; create maps and graphs on a free interactive Web site; analyze data sets; and
collaborate with scientists and other GLOBE students around the world. These projects demonstrate the
importance of scientific investigation as an embedded component of scientific inquiry and also model the
importance of the development of students’ IT fluency and their scientific inquiry abilities.
Scientific Communication
While scientific conceptualization and scientific investigation continue to comprise the two primary
pillars of science education, a third pillar provides an important and often overlooked aspect of science
education: scientific communication. Instruction in scientific communication involves students in
communicating research processes, research results, and knowledge claims via classroom discourse
and public presentation, often including a critical response from peers and experts. Innovative
communication technology tools incorporate computer-based scaffolds to either support or refute
competing theories by constructing valid yet opposing arguments from multiple perspectives in response
to issues. This new emphasis on scientific communication represents a fundamental shift from teaching
science as ‘‘exploration and experiment’’ to teaching science as ‘‘argument and explanation’’ (NRC, 1996,
p. 113).
Clearly, this shift has had an impact on how the scientific educational communities perceive the role of
communication in science endeavors. For example, transformative communication as a cultural tool for
guiding inquiry science (Polman & Pea, 2001); teaching science through online peer discussions (Hoadley &
Linn, 2000); and emphasizing computer-mediated reasoned argumentation have been successful in creating
communities of inquirers who value knowledge communication as much as knowledge creation (Clark &
Sampson, 2008; Ebenezer & Puvirajah, 2005). As an example of this shift in emphasis, Ebenezer and
Puvirajah (2005) conducted a study on the argumentation styles of middle school students about the
particle theory of matter. The study used WebCT discussion boards and resulted in three general categories
of dialogues that can occur in science learning (experiential, referential, provisional). Students’
presumptive reasoning consisted of ‘‘argument from sign,’’ ‘‘argument from cause to effect,’’ ‘‘argument
from evidence to hypothesis,’’ and ‘‘argument from position to know.’’ Although students’ arguments were
presumptive in nature, their communication with peers and teachers demonstrated that it is more than a
passive activity in scientific endeavors; rather, it is an active force that shapes the way science inquiry is
perceived.
Closely related to the study at hand, Liang, Ebenezer, and Yost (2010) have discussed the characteristics
of pre-service teachers’ discourse on a WebCT Bulletin Board in their investigations of local streams in an
integrated mathematics and science course. A qualitative analysis of data revealed that the pre-service
teachers engaged in collaborative discourse when framing their research questions, conducting research, and
writing reports. The science teacher provided feedback and carefully crafted prompts to help pre-service
teachers develop and refine their work. Overall, the online discourse formats enhanced out-of-class
communication and supported collaborative group work. But the discourse on the critical examination of one
another’s viewpoints rooted in scientific inquiry appeared to be missing. This absence suggested that
pre-service teachers should be given more guidance and opportunities in science courses when engaging in
scientific discourse that reflects reform-based scientific inquiry.
In summary, contained within the three hallmarks of scientific inquiry are contemporary higher order
scientific inquiry abilities. The three hallmarks of scientific inquiry are: scientific conceptualization,
scientific investigation, and scientific communication. Scientific conceptualization involves the identification
and development of a deeper understanding of core science concepts that can be used to shape scientific inquiry.
Scientific investigation involves the development of science education standards-based inquiry skills such as
framing a relevant research question, evaluating design, using mathematical knowledge and representations,
making event-evidence-explanation connections, and communicating knowledge claims. Scientific
communication involves the sharing of ideas with respect to research questions, methods, and claims for
peer response and evaluation, thereby approaching objectivity from a social perspective. A variety of technology
applications effectively promote the hallmarks of scientific inquiry. Empirical studies above have used
technologies in the science contexts of teaching, learning, and inquiry—with a science objective in mind that
contributes to scientific literacy (AAAS, 1993; Rutherford & Ahlgren, 1989). Our focus in this study, instead,
is the development of fluency with innovative technologies (and any secondary outcomes) when students are
engaged in authentic environmental research projects.
Methods

Context of Inquiry
The aim of the TITiC project was to educate and train high school science teachers (9–12 grades) to use
IT when conducting empirical science in their classrooms by first engaging them in specially designed
summer institutes; assisting them in the study of the Lake Erie ecosystem with their students using
investigation and measurement tools; and finally, helping them to infuse technologies into their curricula. The
three phases of the TITiC project were as follows.
Phase 1: Teacher Professional Development in Innovative Technologies to Promote Student Capacity
Building. The TITiC project is founded on the notion that the key to supporting students in learning school
science and conducting community-based science research projects is first building science teacher capacity
in innovative technology within authentic research contexts (Knezek & Christensen, 2002). When teachers
immerse students in innovative technologies embedded in a context of scientific inquiry, it is expected that
students will become fluent in IT and acquire scientific inquiry abilities.
Based on the above premises, each year, in Phase 1 of the TITiC project (a specially designed 2-week
summer institute), 15 teachers, 5 from each of three school districts, were taught, through a participatory
approach, the capabilities of the following technologies: All Technologies (the Internet, computer databases,
web pages, PowerPoint presentations, and analytical hardware such as the spectrophotometer and digital
titrator); Global Positioning System (GPS) and Geographic Information System (GIS) devices; Texas
Instruments TI-84+ graphing calculators; and Calculator-Based Laboratories (CBL2s) with various Vernier
sensors and/or a LabPro interface unit with the Logger Pro software. The application of these technologies was taught
within the authentic context of the Lake Erie watersheds at the Knabusch Mathematics and Science Center,
Bolles Harbor, MI. At the end of the summer institute, the external evaluator and his team asked teachers to
rate their preparedness to use specific technologies on a scale of 1 (not well prepared) to 4 (well prepared). The
average scores of the ratings of teacher preparedness in various technologies for 2005, 2006, and 2007 follow
consecutively: 3.39 for the Logger Pro software, 3.44 for Vernier probes, 3.36 for the GPS, 3.47 for the Water
Test kit, 3.51 for the spectrophotometer for analyzing water quality, and 2.54 for the GIS (SAMPI, 2005,
2006, 2007). Except for the GIS-related technologies, which have a steep learning curve, all the other
technologies crucial to science teaching and learning received high ratings. According to Bednarz and Van
der Schee (2006), a reason for the steep learning curve of the GIS software is that it has high technical
demands and its functions are not designed for teaching and learning. This poses a challenge for teachers to
learn the GIS software within a short period of time. The positive ratings for the other technologies,
outlined above, support the conclusion that the teachers are now better able to use innovative technologies
in environmental research projects and, perhaps most importantly, to teach their students how to use these
technologies.
Phase 2: Teachers Engage Students in Research Projects. At the beginning of each school year, the
TITiC teachers taught their students how to use innovative technologies to conduct scientific inquiry.
Phase 2 of the TITiC project was possible only because teachers were given a set of technologies that they had
learned in the summer institute to be used in their own schools with small groups of students. Then the teachers
engaged these students in semester- or year-long scientific research projects that were related to community-
based environmental issues. At the end of Phase 2, the TITiC students presented their research papers to more
than 100 participants at the Student Research Symposium held in May of each year from 2005 to 2008 at a
Regional Educational Service Agency (RESA). Their peers, teachers, and administrators, as well as the
members of the TITiC management and National Advisory Group were present at these symposia. At least
three groups of students from each school delivered paper presentations, and other groups constructed poster
presentations. Moreover, students from one school created their own website to share their research. Two
schools published students’ research projects in their respective journals The Bolles Harbor Journal and The
Southgate Journal. All of these publications are available to the public online through school Web pages.
(Please see Appendix A for a list of research projects.)
Phase 3: Teachers Integrate Innovative Technologies into Classrooms. During Phase 3, teachers were
expected to infuse TESI into their regular science curricula and show evidence of their work. Some teachers
showed evidencevia video-recordings of their various activities that employed IT.Others invited the research
team to observe their classes. For example, a 9th grade teacher integrated IT into a unit on ecology. Abiotic
factors covered in this unit included temperature, pH, water quality, pollution, and sunlight. Students used a
variety of the previously described technologies to test hypotheses on the difference between these abiotic
factors in various areas of the campus (e.g., ‘‘Is there a difference in pH between water in the Lake Erie channel
and the two ponds?’’). The unit concluded by focusing on photosynthesis and cellular respiration as an
explanation for the flow of energy through ecosystems. Students used technology in a series of experiments
100 EBENEZER, KAYA, AND EBENEZER
Journal of Research in Science Teaching
designed to measure changes in carbon dioxide levels of closed systems based on manipulating a variety of
variables such as (a) the amount of sunlight provided to a plant and (b) the effects of temperature on cold-
blooded organisms.
Sample
The sample consisted of a total of 125 high school students from 9th to 12th grades (age 14–18) over a
period of 3 years (2005–2008). There were 45 students in the program the first year, 53 students in the second
year, and 27 students in the third year. Student participation declined in the third year because classrooms in
one school were being remodeled, with the facilities being moved to another site. Thus, teachers in this school
did not have access to the technologies the TITiC project provided for conducting student research projects.
Also, because of the re-visioning of the Detroit Public School System, the TITiC teachers at one African-
American school were placed in other schools. These African-American teachers who left their original school
did not have access to the needed technologies for engaging students in research projects.
In terms of grade levels, 63 9th grade students, 31 10th grade students, 20 11th grade students, and 11
12th grade students took part in the study. The ethnicity of the students who took part in this research was as
follows: White American—91%; African American—5%; and Other—4%. These student participants
represented variations (http://www.schoolmatters.com/) in geography (southeast), context (urban—5
schools, sub-urban—4 schools, and rural—2 schools); economy (economically disadvantaged to
economically advantaged—1 school); and ethnicity/race (Caucasian—7 schools and African-American—
4 schools). According to the 2006 data of the schools that took part in the TITiC project, Grade 12 Science
MEAP averages ranged from 22% (low) to 80% (high), and student economic level (sel) ranged from 4% (rich)
to 72% (poor). The two major minority groups were African American and Arab-American.
The students who participated in this particular research study came from two urban schools (1 African-
American—MEAP average 22% and sel 72%; 1 Arab-American—MEAP average 59% and sel 30%), three
sub-urban schools (Caucasian—MEAP 80% and sel 4%; MEAP 41% and sel 23%), and one rural school
(Caucasian—MEAP 50% and sel 34%).
Procedures and Research Design
This study employed a one-group pre-test–post-test design (Campbell & Stanley, 1963). To identify
students’ pre-perceptions of their IT fluency, the surveys were administered each year to all students as a
pretest at the beginning of Phase 2 of the TITiC project. The post-test was administered each year at the end of
Phase 2, when the students had finished their IT-embedded environmental research projects. Also, at the end
of Phase 2, semi-structured individual interviews were conducted with 45 randomly selected students to
explore their IT fluency.
Quantitative Data Collection and Analysis
Fifty-one items were developed and incorporated into a Likert-type survey to measure students’
perceptions of IT fluency by mining the assessment database of the Education Development Center, Inc.,
Boston, MA and by further researching the characteristics of various IT. The survey was then reduced to 42
items and organized around three sub-scales after three university science educators and two IT experts
examined the content validity of each item. The first sub-scale consisted of 15 items focusing on students’
general IT fluency, focusing on technologies students may have used when they worked on their long-term
research projects. The second sub-scale consisted of 18 items that were related to students’ fluency with GIS
andGPS technologies. The third sub-scale consisted of nine items and focused on students’ fluency using the
CBL2 interface with the EasyData program. For all items on these scales, students were asked to rate their
responses to the following statement on a 4-point scale (1 = not fluent, 4 = very fluent). The option ‘‘Not
applicable’’ was included in the survey to accommodate students who did not have any experience with
technologies indicated in the survey items, and this option was anchored by 0. Alpha reliability coefficients of
the sub-scales were found to be 0.84, 0.88, and 0.80. A paired-sample t-test for each individual sub-scale for
pre- and post-tests was used to evaluate the impact of conducting environmental research projects with IT on
the students’ perceptions of fluency in IT. To control for possible inflated Type I error rates due to multiple
t-tests, the alpha level was adjusted to 0.0167 (0.05 divided by the number of t-tests).
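A minimal sketch of the analysis just described; the eight score pairs are hypothetical stand-ins for one sub-scale's pre/post totals, not the study's raw data:

```python
# Paired-sample t-test plus Bonferroni adjustment, as described above.
# The score pairs are hypothetical stand-ins, not the study's data.
import math
import statistics

def paired_t(pre, post):
    """Return the paired-sample t statistic and its degrees of freedom."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    sd = statistics.stdev(diffs)              # sample SD of the differences
    t = statistics.mean(diffs) / (sd / math.sqrt(n))
    return t, n - 1

def bonferroni_alpha(alpha, n_tests):
    """Divide the family-wise alpha by the number of tests run."""
    return alpha / n_tests

pre  = [26, 24, 31, 19, 28, 22, 30, 25]       # hypothetical pre-test totals
post = [41, 39, 47, 33, 44, 38, 45, 40]       # hypothetical post-test totals

t, df = paired_t(pre, post)
alpha = bonferroni_alpha(0.05, 3)             # three sub-scales -> ~0.0167
```

With three sub-scales tested, an observed p-value counts as significant only if it falls below 0.0167 rather than 0.05.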
Qualitative Data Collection and Analysis
Qualitative data were collected through interviews and students’ scientific research papers. At the end of
the study, individual interviews were carried out with 45 randomly selected students in order to explore their
IT fluency. Students were asked general questions about their experiences using IT, such as the following:
What research did you do? What kinds of technologies did you use in your research? How did you use these
technologies? Do you think you are better now than before in using technology because of your experience in
this project? Give me an example. What was your experience in using technology in your research? Each
interview lasted about 30 minutes. All interviews were audio-recorded and transcribed verbatim. Interview
data were used to confirm or disconfirm the statistical data.
Students conducted their environmental research studies in small groups of two to four. Each year, each
school was allowed to present no more than four papers. Therefore, the teacher decided which groups should
present papers based on the significance of the group’s topic and work and the completion of the paper. We
collected 38 research papers (year 1 = 19, year 2 = 10, and year 3 = 9) written by 110 students that were
presented at the Student Research Symposium at RESA each spring over 3 years. The Scientific Inquiry Rubrics,
consisting of 11 abilities, were used for assessing the quality of each paper (see Appendix B).
Creation of Rubrics
To assess the quality of student papers, reform documents (NRC, 1996, 2000) were reviewed to
formulate scientific inquiry criteria which involve contemporary higher order scientific inquiry abilities.
Through extensive discussion over several days, we initially outlined 15 ideal scientific inquiry criteria, but
not all criteria seemed to be fruitful. Because most of the students’ papers were not experimental in nature and
because some scientific inquiry criteria never appeared in students’ papers, we eliminated four full criteria
and a few phrases from the wording of two criteria. The four criteria we eliminated included the following: (1)
define variables and constants that guide the experimental design; (2) analyze alternative explanations and
select the plausible one by reviewing scientific understandings, weighing evidence, and examining logic; (3)
formulate, support, elaborate, revise, or reject explanations and models using scientific knowledge and
logical reasoning and evidence from investigation; and (4) use technologies for communication. Original
phrases we left out of two criteria were: (1) formulate a testable hypothesis (revealing relationship between
variables) and propose explanation(s) to answer the question; and (2) defend scientific arguments (and
respond to critical comments) connected with investigation, evidence, and scientific explanation. Although
we eliminated these phrases from the original rubrics, teachers and science teacher educators should
emphasize all 15 scientific inquiry criteria to help students conduct effective scientific inquiry and write
research papers that reflect scientific credibility and quality. Finally, 11 scientific inquiry criteria were
formulated by examining the scientific inquiry standards and students’ research papers.
Conceptualization of Scientific Inquiry Criteria
In this section, we briefly explain all 11 scientific inquiry criteria we included in our rubrics. An
interesting observation is how the criteria may be grouped into three hallmarks of scientific inquiry as
discussed in the theoretical framework: scientific conceptualization (criteria 1–4), scientific investigation
(criteria 5–9), and scientific communication (criteria 10–11) that promote contemporary higher order
scientific inquiry abilities.
Criterion 1: ‘‘Define a scientific problem based on personal or societal relevance with need and/or
source’’ means that students ought to identify and accurately define a community-based problem that
is meaningful to them. The problem must have personal or societal relevance. Students should defend
the problem based on the need for the study or because they have identified the problem from a
reliable source.
Criterion 2: ‘‘Formulate a statement of purpose and/or scientific question’’ means students should
write the purpose and state a scientific question with clarity and precision.
Criterion 3: ‘‘Formulate a testable hypothesis and propose explanation(s)’’ means students should be
able to state a hypothesis that lends itself to testing. Also, the hypothesis should be accompanied by
coherent explanation(s).
Criterion 4: ‘‘Demonstrate logical connections between scientific concepts guiding a hypothesis and
research design’’ means that students should identify the scientific concepts and create a conceptual
system that will guide the hypothesis and research design.
Criterion 5: ‘‘Design and conduct scientific investigations related to the hypothesis’’ means that students
should logically outline methods and procedures, use proper measuring equipment, heed safety
precautions, and conduct a sufficient number of repeated trials to validate the results.
Criterion 6: ‘‘Collect and analyze data systematically and rigorously with appropriate tools’’ means
students should collect and analyze data accurately using appropriate tools, methods, and
procedures.
Criterion 7: ‘‘Make logical connections between evidence and scientific explanation’’ means students
should connect evidence from their investigations to explanations based on science theories.
Criterion 8: ‘‘Use a variety of technologies for investigation’’ means that students should use the
necessary tools (e.g., hand tools; measuring instruments and calculators; electronic devices; and
computers) for the collection, analysis, and display of data.
Criterion 9: ‘‘Use mathematical tools and statistical software’’ means students should use these to
collect, analyze, and display data in charts and graphs and to conduct statistical analyses.
Criterion 10: ‘‘Communicate through scientific paper for replication and enhancement’’ means that
students should be able to write a clear scientific paper with sufficient details so that another
researcher can replicate or enhance the methods and procedures.
Criterion 11: ‘‘Defend scientific arguments connected with investigation, evidence, and scientific
explanation’’ means students should defend scientific arguments based on the event or problem that
they are studying, evidence they have gathered, and the scientific theory. In other words, students
should see the relationships among events, evidence, and explanations to make and defend a
scientific argument.
Analysis of Students’ Research Papers
Each research paper was critically analyzed, and scores were assigned for each criterion using the
Scientific Inquiry Rubrics (see Appendix B) in order to identify students’ scientific inquiry ability levels:
proficient, developing, beginning, or missing (see Table 1). Inter-rater reliability was established by one of the
researchers and one external expert, who independently scored the students’ research papers using the above rubric.
During the analyses of the research papers, when discrepancies arose across individual analyses, the
researcher and expert reviewed the relevant papers together, discussed discrepancies in analyses, and reached
consensus on the students’ scientific inquiry abilities based on their research papers. The researcher and
expert reached 93% agreement on the interpretations.
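The 93% figure is an exact-match proportion across rubric scores; a small sketch with illustrative ratings (not the actual scores):

```python
# Percent agreement between two raters: the share of items on which
# both assigned exactly the same rubric score (0-3). Illustrative data.
def percent_agreement(rater_a, rater_b):
    """Fraction of items scored identically by both raters."""
    matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
    return matches / len(rater_a)

rater_a = [3, 2, 2, 1, 3, 0, 2, 3, 1, 2]   # hypothetical rubric scores
rater_b = [3, 2, 1, 1, 3, 0, 2, 3, 1, 2]

agreement = percent_agreement(rater_a, rater_b)   # 0.9 for these lists
```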
To best explain how we derived the scores for each criterion and assigned various levels (as represented
in Table 1), and because of space limitations, we provide evidence from research papers for only criterion 11.
For Criterion 11, ‘‘Defend scientific arguments connected with investigation, evidence, and scientific
explanation,’’ 4 research papers received a score of 3 (proficient), 17 research papers received a score of 2
(developing), 15 research papers received a score of 1 (beginning), and 2 research papers received a score of 0
(missing).
Table 1
Numbers of research papers based on each level of scientific inquiry abilities in each criterion

                          Scientific Inquiry Criteria
Levels                  1   2   3   4   5   6   7   8   9a  10  11
Proficient (3 points)  33  27  26  17  15  20  10  28  10  15   5
Developing (2 points)   3   9   8  17  19  14  22   7  15  16  16
Beginning (1 point)     1   2   2   4   3   4   4   2   6   6  15
Missing (0 point)       1   0   2   0   1   0   2   1   2   1   2

aFive research papers were assessed as not applicable.
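Assuming the point values given in the row labels (proficient = 3, developing = 2, beginning = 1, missing = 0), the per-criterion means reported later in Table 3 can be recovered from these counts as weighted averages:

```python
# Weighted-average mean score for one criterion, computed from the
# level counts in Table 1 and the point values in its row labels.
def criterion_mean(proficient, developing, beginning, missing):
    total = proficient + developing + beginning + missing
    return (3 * proficient + 2 * developing + 1 * beginning) / total

m1 = criterion_mean(33, 3, 1, 1)   # criterion 1: 106/38, about 2.79
m2 = criterion_mean(27, 9, 2, 0)   # criterion 2: 101/38, about 2.66
```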
The Long Term Study on the Effects of Road Salt and Volume on the Salt Load Levels of Bean Creek, from
which the following excerpt was taken, was scored at the ‘‘proficient’’ level. Scientific argument was indeed
evident in the research paper, and it was connected with hypothesis, investigation, evidence, and scientific
explanation. Evidence of these elements is underlined and within brackets in the following excerpt:
One reason that the chloride levels did not significantly increase [investigation] is because of the
massive increase of volume [explanation] in Bean Creek. At the downstream site, there was a 7,401%
increase in the water that flows through the stream per second [evidence], which means that the
increase of stream flow was apparently diluting the chloride levels [scientific explanation] of the sites.
The salt load levels of the downstream site significantly increased 6,980% once salt was applied to the
roads, which supports my hypothesis that they would increase, but I believe the significant increase is
caused by the significant increase in volume instead of the salt applied to the roads [alternate
explanation].
The student authors/researchers of the Water Quality at Ives Road Fen Preserve study defended
scientific arguments reasonably well by connecting them with investigation, evidence, and explanation,
indicating a ‘‘developing’’ level of proficiency. The ammonium, chloride, conductivity, hardness, and
phosphorus levels of the groundwater were discussed with possible explanations using terms such as ‘‘think’’
and ‘‘feel.’’ Arguments were not connected to evidence based on numerical values or their t-test values. For
example, the following text exemplifies the five reasons these student authors/researchers gave based on five
tests for the foregoing variables. Concerning the chloride test, students stated thus:
We think we got these results for chloride because of the well locations. TNC 1 and 3 are by the golf
course so the chemicals from the grass are running into the fen and polluting the wells.
In the pH Level in Precipitation study, students collected rainwater and snow from November to March
and measured the pH level of the water to determine whether it was acidic. They defended their results
through scientific arguments weakly connected to evidence and scientific explanation, indicating a
‘‘beginning’’ level of proficiency. For example, they stated two reasons why there would be a greater level of
acidity in the winter months when they tested the rainwater samples, but their results actually showed a much
higher rainwater acidity level than the other months tested. Their reasons for the increase in the acidity level
were as follows:
. . .Thus, it is conclusive that people in Southgate use more fossil fuels during the winter months.
Furthermore, Southgate is so close to Detroit that all the pollution from the ‘Motor City’ is mixed in
Southgate’s atmosphere, consequently making our overall air pollution worse already.
These students explained scientifically why increasing the fossil fuels would cause air pollution.
However, without adequate explanation, these students simply stated that Detroit causes air pollution in
Southgate because of the mix of pollutants from both places.
The Surrounding Rivers Impact on Lake Erie and Its Watershed study was classified into the ‘‘missing’’
level of proficiency. Students simply repeated the results instead of defending arguments based on the
problem that they studied and evidence they gathered:
. . .The Detroit, Huron, and Rouge Rivers showed to have the least amount of bacteria; while the
Raisin River proved to have the largest amount. The phosphor results have just enough nutrients for the
water supply but do not contain an excessive amount that could be harmful to the life of the
environment. The water clarity varied depending on location. For example, Wyandotte the Detroit
River only had a reading of 4 compared to Dearborn Heights at the Rouge River with a reading of 28,
which proved to be cloudier.
Results and Discussion
Based on the two major research questions, the results section consists of two parts: The first part
discusses changes in students’ perceptions of their IT fluency based on the comparison of pre- and post-test
survey results. The second part discusses students’ acquisition of scientific inquiry abilities as reflected in
their research papers as a result of having engaged in environmental projects with IT.
Students’ Perceptions of Their Fluency in Innovative Technologies
Table 2 indicates changes in high school students’ perceptions of their IT fluency based on the statistical
analyses of the three components of the pre- and post-test Likert survey: (a) Fluency with All Technologies;
(b) Fluency with GPS/GIS; and (c) Fluency with the CBL2 interface with the EasyData program. All results
show statistically significant increases (p< 0.001) in students’ perceptions of their IT fluency after their
engagement and experiencewith community-based environmental research projects. Interview data are used
to support the significant change in each of the foregoing components.
All Technologies. In terms of student fluency in ‘‘All Technologies’’ that they used when completing
their projects, Table 2 indicates that there was a significant increase in the scores from pre-test (M = 26.42,
SD = 7.22) to post-test (M = 42.03, SD = 9.61) because of the intervention (students’ engagement in and
experiences with environmental research projects) [t(124) = 15.58, p < 0.001]. For example, in the interview,
64% of students reported that the TITiC project had impacted their Internet fluency at a ‘‘high level.’’ This
outcome was expected because students used the Internet extensively to study background information
related to their research problems. One student made the following comment: ‘‘During this project, I did a lot
of research for background information. I used some articles from local papers, but found that it was much
easier and efficient to use the Internet for research.’’ Another student stated the purpose for which the Internet
was used and provided an example: ‘‘We also used the Internet to learn about our topic. Internet was very
useful to understand what exactly we had to do to go for and understand the implications of dissolved oxygen,
fish anatomies, how their metabolisms work.’’
GPS/GIS. The GPS/GIS survey results in Table 2 indicate that the post-test scores (M = 27.54,
SD = 8.88) were statistically better than the pre-test scores (M = 16.70, SD = 7.61) because of the
intervention [t(124) = 12.66, p < 0.001]. In the interview, 13% and 24% of students reported that the TITiC
Project had impacted their fluency in GPS/GIS to a ‘‘high level’’ and an ‘‘accepted level,’’ respectively.
When students were asked in the interview to explain the changes in their fluency in GPS and GIS, we realized
that most students were good at using the GPS but not the GIS. The students particularly stated that they were
very good at using a GPS unit to pinpoint a location, to navigate to a given set of coordinates, and to bring GPS
data into ArcView GIS. The fact that students were more proficient at the GPS than the GIS is substantiated by
the following excerpt:
We wanted to mark the type of the samples of plants and trees around our school using GPS unit and to
analyze and make a map of the trees around surroundings with the GIS. We were going to make a field
guide for the elementary school kids about these plants. We did not have a real opportunity to learn
how to use technology before this project. We are definitely better than earlier. I spent three days
talking on the phone how to use GIS after looking at the book. It is horrible. But I learned it finally.
With GIS software, you have to follow each dot exactly and perfectly. Otherwise it does not work. GPS
is pretty easy to use itself to mark the points. First I felt uncomfortable when hearing using the
technology. I thought I can’t figure how to do this. But, it is easier now. We like using it a lot more than
the first time.
Table 2
The descriptive statistics and t-test results of the fluency surveys

Scales             No. of Items    N     Pre-Scores      Post-Scores     Mean          t(124)
                                         Mean (SD)       Mean (SD)       Difference
All Technologies        15        125    26.42 (7.22)    42.03 (9.61)    15.61         15.58*
GPS/GIS                 18        125    16.70 (7.61)    27.54 (8.88)    10.84         12.66*
CBL2/EasyData            9        125    10.86 (5.77)    22.28 (6.97)    11.42         14.97*

*p < 0.001.
This student was representative of many students who attempted to learn the GPS and GIS. The student,
together with his classmates, learned to use the GPS by identifying trees around the school and mapping them
using the GIS ArcView software. This student found the GPS to be easy when marking the points. However,
the student felt overwhelmed trying to map the various trees simply by looking at the GIS book. His persistence
to learn the GIS technology had him spend 3 days on the phone. He found learning the GIS program to be
‘‘horrible.’’ He admitted that if each dot is not followed ‘‘exactly and perfectly,’’ the GIS will not work. The
student claimed that he was uncomfortable with the GIS and never thought he could figure it out. Likewise, the
teachers also attested that learning the GIS was very difficult. At the end of the summer institute, average
scores of teacher preparedness to teach GPS and GIS were 3.36 and 2.54, respectively, which are comparable
to students’ fluency with the same technologies. In the TITiC project there were several obstacles to learning
GIS. The teacher training in GIS was crowded out by the other events of the TITiC project’s summer institute.
The GIS curriculum material used in the summer institute was not science education oriented. Observation of
teacher behavior in the summer training indicated the unwillingness of teachers to learn and use the GIS—a
reason that has also been pointed out by Bednarz (2004). Perhaps teachers do not care for this innovative
learning tool because of the difficulty of using GIS software and the lack of time for teachers to learn GIS and
use it in class (Kerski, 2003). Obviously, students are affected by the lack of teacher preparation in and
attitudes toward the GIS.
CBL2/EasyData. The analysis of students’ responses to the CBL2/EasyData survey indicated that there was a
significant increase in the scores from pre-test (M = 10.86, SD = 5.77) to post-test (M = 22.28, SD = 6.97)
because of the intervention [t(124) = 14.97, p < 0.001]. In the interview, 44% and 38% of students reported
that the TITiC Project had impacted their fluency in CBL2 to a ‘‘high level’’ and an ‘‘accepted level,’’ respectively.
Importantly, students expressed that they were able to design laboratory activities that incorporated the use of
EasyData to collect, analyze, interpret, and transfer data. For example, consider what one student stated in the
interview:
I am much better when it comes to using technology after this (IT) experience. I was able to do all of
the tests that were needed such as turbidity, nitrates, phosphates, and pH. In the beginning I had never
even heard of some of them and by the end I was doing them in a short amount of time. Now I am able
to just pick up a probe and go through the directions and have no trouble at all at calibrating them and
using them. I like using the probes and sensors . . . I thought it was funny when I seen a temperature
probe at my doctors office . . .Like hey I know what that is.
Referring to the probes and sensors, one student stated that he had never even heard of such technologies.
After the IT experience he was able to use all the water testing probes. He was also adept at using these IT
because he was able to use them in a shorter period of time. He also stated that he is able to pick up a probe and
use it. He can now relate what he used in his classroom with what is being used in the hospital setting. Students’
changes in perceptions of CBL2, in fact, correlate with teachers’ preparedness in the use of CBL2 (3.39/4.0).
This finding clearly suggests that teachers’ IT fluency has a direct impact on students’ IT fluency.
Comparisons of the students’ pre-test mean scores represented in Table 2 (with maximum values of
60.00, 72.00, and 36.00 for each scale, respectively) showed that the students perceived their pre-fluency in
technologies at rates of 44%, 23%, and 30%. This means that the students’ pre-perceptions of fluency in
GPS/GIS were lower than their pre-perceptions of fluency in CBL2/EasyData, and especially lower than their
pre-perceptions of fluency in All Technologies such as the Internet, computer databases, web pages,
PowerPoint presentations, the spectrophotometer, and the digital titrator.
When comparing each scale’s mean pre- and post-test scores (see Table 2), the percentage increases of
mean values with respect to students’ fluency in All Technologies, GPS/GIS, and CBL2/EasyData were
59%, 65%, and 105%, respectively. Although statistical findings revealed significant increases for all three
scales, these percentage increases clearly showed that the most positive change in students’ perceptions of
fluency was related to CBL2/EasyData. The comparison of students’ post-test mean values (with the maximum
score for each scale) showed success rates of 70%, 38%, and 62% for the All Technologies, GPS/GIS, and
CBL2/EasyData scales, respectively. These values indicate that the students were more fluent in their abilities
to use All Technologies and CBL2/EasyData than they were in their abilities to use GPS/GIS.
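The percentage increases above follow directly from Table 2's means; a short sketch that reproduces them:

```python
# Percentage increase of each sub-scale's mean from pre- to post-test,
# computed from the means reported in Table 2.
def percent_increase(pre_mean, post_mean):
    return (post_mean - pre_mean) / pre_mean * 100

scales = {
    "All Technologies": (26.42, 42.03),
    "GPS/GIS":          (16.70, 27.54),
    "CBL2/EasyData":    (10.86, 22.28),
}
increases = {name: round(percent_increase(pre, post))
             for name, (pre, post) in scales.items()}
# -> All Technologies 59%, GPS/GIS 65%, CBL2/EasyData 105%
```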
Students’ Development of Scientific Inquiry Abilities
The TITiC project objective was to develop the IT skills of teachers so that they, in turn, would build student
fluency with IT by engaging students in environmental research projects. A student in our project described
the link between the use of IT and the design and doing of scientific inquiry in the following manner:
The technology that our teacher taught shaped the research questions because we had to use them in
research projects. We selected this project because we wanted to use the water quality probes. We used
the GPS to select three spots to take water samples. Our group also used the Internet to gather
background knowledge and to find out about people who lived around River Huron so that we can talk
to them about the river. We used the GIS ArcView to map the GPS locations and to decide on the levels
of the river.
This excerpt suggests that students selected their environmental research project based on IT they
wanted to learn. They wanted to use the probes to determine water quality of the Huron River. They used the
Internet to gather background knowledge of the river and to find out about the people who lived in the areas
surrounding the river so that they could talk with them about the river. They used the GIS tomap the data they
collected based on the GPS locations. Based on students’ research experience, aside from determining the
water quality of the Huron River, one can assume that they learned how to use the IT instruments and also
improved in their scientific inquiry abilities. Thus, to become fluent in using IT and cultivate scientific inquiry
abilities, students need to be immersed in technology-embedded scientific inquiry.
Although the TITiC project was not necessarily focused on developing students’ scientific inquiry
abilities, we were curious to find out whether conducting environmental research projects with IT had any
influence on the development of those abilities as reflected in students’ research papers. As mentioned in the
methodology section, we originally outlined 15 scientific inquiry criteria based on scientific inquiry standards
(NRC, 1996) to assess students’ research papers. But as we analyzed their papers, four criteria seemed to be
completely and conspicuously absent, so we were able to apply 11 criteria. Although there is much to learn
about developing students’ scientific inquiry abilities in terms of scientific inquiry standards, the students’
research papers gave ample evidence that they had developed 7 of the 11 scientific inquiry abilities as a result
of having engaged in environmental projects with IT.
While the Scientific Inquiry Rubrics (see Appendix B) were used to critically analyze all 38
research papers (see Appendix A), the students were not explicitly made aware of the scientific inquiry
standards in their research projects because this was not the focus of the TITiC project. However, the
process of qualitatively analyzing and classifying information in 38 research papers into the categories of
‘‘missing,’’ ‘‘beginning,’’ ‘‘developing,’’ and ‘‘proficient’’ as shown in Table 1 in the Methods Section;
converting them into scores; and finding the mean values for each criterion was revealing (see Figure 2 and
Table 3).
Mean scores of students’ scientific inquiry ability for each criterion in the rubric were computed. The
mean values were interpreted using the following guide: Missing (0–0.74); Beginning (0.75–1.49);
Developing (1.50–2.24); and Proficient (2.25–3.00). As represented in Figure 2 and Table 3, the mean values
in the 2.25–3.00 range indicate that the students’ abilities to conduct scientific inquiry according to seven
criteria, that is, 1, 2, 3, 4, 5, 6, and 8, were at the proficient level. It should also be noted that the mean values of
criteria 1, 2, 3, and 8 are higher than 2.50, which indicates that students’ scientific inquiry abilities reached
extremely high levels. These criteria include defining a scientific problem based on personal or societal
relevance with need and/or source (M = 2.79, SD = 0.62), formulating a statement of purpose and/or
scientific question (M = 2.66, SD = 0.58), formulating a testable hypothesis and proposing explanation(s)
(M = 2.53, SD = 0.83), and using a variety of technologies for investigation (M = 2.63, SD = 0.71). Mean
values of the remaining four criteria (7, 9, 10, and 11) in the rubric were within the range of 1.50–2.24, which
showed that the students achieved the level of ‘‘developing.’’ The students’ mean value of 2.32 across all
criteria was at the ‘‘proficient’’ level, given the cut-off score of 2.25. As well, the comparison of the students’ total
average score of 25.42 with the rubric’s maximum value of 33.00 showed a success rate of 77%.
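The band interpretation described above can be sketched in a few lines of Python. This is a minimal sketch: the cut-offs and mean scores are those reported in the text and in Table 3, while the function and variable names are our own.

```python
# Minimal sketch of the rubric-band interpretation described in the text.
# Band cut-offs and mean scores come from the article; names are ours.

def ability_level(mean_score):
    """Map a criterion's mean score (0-3 scale) to its ability level."""
    if mean_score < 0.75:
        return "Missing"
    if mean_score < 1.50:
        return "Beginning"
    if mean_score < 2.25:
        return "Developing"
    return "Proficient"

# Mean scores for the eleven criteria as reported in Table 3.
criterion_means = {1: 2.79, 2: 2.66, 3: 2.53, 4: 2.34, 5: 2.26, 6: 2.42,
                   7: 2.05, 8: 2.63, 9: 2.00, 10: 2.18, 11: 1.63}

proficient = sorted(c for c, m in criterion_means.items()
                    if ability_level(m) == "Proficient")
print(proficient)  # criteria 1-6 and 8, matching the reported result

# Total average score of 25.42 against the rubric maximum of 33.00.
print(round(25.42 / 33.00 * 100))  # the 77% success rate
```

Running the sketch against the Table 3 means reproduces both the set of proficient criteria and the 77% figure.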
STUDENTS IN ENVIRONMENTAL RESEARCH PROJECTS 107
Journal of Research in Science Teaching
Except for criterion eight (see Table 3), which has to do with IT, students’ levels of proficiency for
criteria 1–6 might be attributed to their teachers’ past science instructional practice and/or the discussion of
scientific inquiry standards with teachers in each summer institute. Teachers were given handouts on
scientific inquiry standards and how they can be applied to research. Teachers were encouraged to pay
particular attention to the standards when students were engaged in environmental research projects with IT.
Figure 2. Mean scores of students’ scientific inquiry abilities based on eleven criteria.

Table 3
Means and standard deviations of students’ scientific inquiry abilities based on eleven criteria, including the
overall mean and total average scores (N = 38; standard deviations in parentheses)

1. Define a scientific problem based on personal or societal relevance with need and/or source: 2.79 (0.62)
2. Formulate a statement of purpose and/or scientific question: 2.66 (0.58)
3. Formulate a testable hypothesis and propose explanation(s): 2.53 (0.83)
4. Demonstrate logical connections between scientific concepts guiding a hypothesis and research design: 2.34 (0.67)
5. Design and conduct scientific investigations related to the hypothesis (methods and procedures are logically outlined; proper measuring equipment is used; safety precautions are heeded; and sufficient repeated trials are taken to validate the results): 2.26 (0.72)
6. Collect and analyze data: 2.42 (0.68)
7. Make logical connection between evidence and scientific explanation: 2.05 (0.77)
8. Use a variety of technologies for investigation: 2.63 (0.71)
9. Use mathematical tools and statistical software: 2.00 (0.87)
10. Communicate through scientific paper for replication and enhancement: 2.18 (0.80)
11. Defend scientific arguments connected with investigation, evidence, and scientific explanation: 1.63 (0.79)
Mean for all criteria: 2.32 (0.34)
Total average score: 25.24 (4.57)

Current emphases in scientific inquiry according to the national body of science educators focus on
criterion 7 (evidence and explanation connection); criterion 9 (the use of mathematical tools and statistical
software); criterion 10 (communication); and criterion 11 (argumentation). These are new dimensions of
epistemology of science that promote contemporary higher-order scientific inquiry abilities. Students scored
lower in these areas, particularly on criterion 11 (M = 1.63, SD = 0.79), compared to all other criteria.
Students’ mean scores on these criteria suggest that teachers have not acquired skills in these areas and thus
these skills are not in their repertoire of pedagogical content knowledge in science.
Among scientific criteria 7, 9, 10, and 11, we paid considerable attention to criterion 9, ‘‘the use of
mathematical tools and statistical software,’’ because the TITiC year-1 students’ research papers showed a
lack of statistical analysis. They had not even calculated the mean scores of their experimental values. Hence,
starting from year 2, simple statistics (e.g., paired and independent samples t-tests, ANOVA, and correlation)
were taught to teachers during the summer institute so that they, in turn, could teach these simple statistics to
their students. Also, the graduate research assistants and the TITiC project’s postdoctoral scholar visited
schools to help teachers with statistical analysis in scientific inquiry. This extra attention to professional
development focused on statistics began to show an effect on students’ research in year 2.
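The kind of ‘‘simple statistics’’ introduced in the summer institute can be illustrated with a short sketch. The paired-samples t statistic below is computed by hand with only the Python standard library; the upstream/downstream turbidity readings are invented for illustration and are not project data.

```python
import math
import statistics

def paired_t(before, after):
    """Paired-samples t statistic: t = mean(d) / (stdev(d) / sqrt(n))."""
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    return statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))

# Hypothetical turbidity readings (NTU) at six paired river sites.
upstream = [12.1, 9.8, 14.3, 11.0, 10.5, 13.2]
downstream = [15.4, 11.2, 16.9, 13.8, 12.1, 15.0]

t = paired_t(upstream, downstream)
print(round(t, 2))  # a large positive t: downstream readings run consistently higher
```

With n - 1 = 5 degrees of freedom, students would then compare the t value against a t table, which is the step that follows computing the statistic.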
In fact, we found that the mean scores of the first-year students’ scientific inquiry abilities were lower
than those of the second- and third-year students for criteria 7, 9, 10, and 11. For example, mean scores of the
first-year students’ scientific inquiry abilities were 1.95, 1.56, 1.84, and 1.53, while the mean scores of
second- and third-year students together were 2.16, 2.41, 2.53, and 1.73 for criteria 7, 9, 10, and 11,
respectively. When these mean values for criteria 7, 9, 10, and 11 were compared, there was a noticeable
increase for criterion 9—the use of mathematical tools and statistical software—from the first year to the
following years, since there was a 55% increase in favor of the second- and third-year students. During the past
3 years of the project, we have witnessed much improvement in teacher knowledge of statistics and students’
application of statistics in their research. In future projects, we need to attend to the ‘‘developing’’ and the
neglected dimensions of scientific inquiry that are of higher order.
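The year-over-year gain on criterion 9 reported above is a straightforward percent-increase calculation; the means 1.56 and 2.41 are taken from the text, and the result comes to approximately the 55% the text cites.

```python
def percent_increase(old, new):
    """Percent increase from old to new."""
    return (new - old) / old * 100

# Criterion 9 means: year-1 students vs. year-2/3 students combined.
print(round(percent_increase(1.56, 2.41), 1))  # about 54.5, i.e., roughly 55%
```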
Implications
In most research projects, the primary focus is on student exploration of a particular scientific issue.
Technology use and/or learning to use technology and understanding of scientific inquiry criteria within the
hallmarks of conceptualization, investigation, and communication are rarely brought to the foreground in
research. Technology and science education reformers are constantly asking educators to pay attention to the
use of technologies and to the understanding of scientific inquiry criteria by highlighting the respective standards. The
purpose of the TITiC project was to develop IT fluency by engaging students in hands-on issue-based
environmental research. Although students participated in real-world environmental projects that had
personal and/or societal relevance and some indeed studied and contributed to existing science-related
research in parks, nature conservancies, and the NOAA, the intention of this participation was not to focus on
the awareness of the problem or problem-solving techniques but to help teachers and students become more
proficient with the IT being used. However, students went through the experience of researching real-life
issues and benefited from the mentoring process and/or from their previous science learning experience.
Either way, the use of innovative technologies in the first place allowed students to conduct high-end
technology-related projects and learn to use IT. As a result, two corollaries are apparent: (1) conducting
scientific inquiry that employs IT leads to increased IT fluency; and (2) IT fluency within research projects
influences scientific inquiry abilities.
Conducting Scientific Inquiry with IT Leads to Increased IT Fluency
During the summer institute, the TITiC project intentionally selected the study of environmental issues
to develop teacher skills in using innovative technologies. Teachers, in turn, engaged students in scientific
inquiry with IT by focusing on environmental issues (see Appendix A for students’ research projects). The
environmental research topics students selected with the assistance of teachers are a reflection of the various
innovative technologies that they had learned to use. For example, students involved in The Surrounding
Rivers Impact on Lake Erie and its Watershed project used LabPro, DataMate program, turbidity sensor, pH
sensor, Phosphate Accu Vac, Nitrate Accu Vac, stopwatch, TI graphing calculator, GPS unit, GIS ArcView
program and Hach spectrophotometer. At each testing location where the rivers flowed into Lake Erie,
students used the GPS to ‘‘plot the coordinates where the water was taken from.’’ The GPS also allowed them
to ‘‘view as well as compare the distance between test sites.’’ Once each river sample was collected, ‘‘the GPS
points were plotted onto an ArcView map.’’ Then the testing of each water sample began for pH, turbidity,
phosphates, and nitrates. For example, the Vernier Turbidity Sensor allowed students to determine the overall
murkiness of the water of the surrounding rivers. While students determined the water quality for murkiness
of the surrounding rivers of Lake Erie, they were, indeed, learning to use the turbidity sensor. Similarly, they
had had practice with the pH sensor. While gathering data in real time and using Texas Instruments TI-84 Plus
graphing calculators to construct graphs, students in the river study learned to use the various technologies.
This research project demonstrates how students learned to use the innovative technologies in authentic
research contexts in studying real environmental problems. It is, therefore, not surprising that students
changed their perceptions of IT fluency by the end of their immersion in research projects when the post-
intervention survey was conducted. Because students’ perceptions of fluency with IT consistently increased,
conducting scientific inquiry with IT in order to understand and assess environmental issues seems
to provide a fertile ground for increasing proficiency with IT.
While the focus of Lapp and Cyrus (2000) and Schultz’s (2003) research studies is on learning to conduct
science investigations with IT, our study focused on the reverse—learning to use IT while conducting science
investigations. Similarly, while Ramos et al. (2003) studied how to integrate environmental analysis and
assessment by implementing GIS technology in the chemistry laboratory, our study conducted environmental
research projects in order to foster learning about how to use the GPS and GIS. In order to use IT in research,
immersing students in the study of real-life problems is important.
IT Fluency Development within Research Projects Influences Scientific Inquiry Abilities
Often studies in science learning focus on the use of relevant IT in conducting investigations that attempt
to solve environmental problems. For example, Ramos et al. (2003) in their GIS-based project focused on the
problem of lead in the sediment of a local pond. In the foregoing study, unlike our study, the educational aim
was neither the technology used (i.e., GIS) nor the development of understanding of scientific inquiry.
Perhaps these aims were a natural spin-off of inquiring into the problem in the foregoing study.
In our study, as a result of using innovative technologies such as the GPS/GIS, students conducted
scientific inquiry of environmental issues that were relevant to and compatible with using IT. Because of such
a learning environment to develop IT fluency, the natural outcome was student development in scientific
inquiry criteria as reflected by their research papers. Thus, we are able to make claims in this study about how
becoming fluent with IT has supported student scientific inquiry in their environmental research projects.
Student engagement in research projects to learn the use of IT and writing scientific papers has provided us
with insights into their understanding of the various criteria of scientific inquiry. This process also has helped
us to understand what scientific criteria students achieved and what they did not achieve. We have also learned
about the assessment of students’ understanding of scientific inquiry.
Although the outcome of IT fluency in our study was also the development of scientific inquiry abilities,
it is likely that more can be achieved if teachers intentionally help students become more cognizant of the
meaning behind each scientific inquiry criterion. In our study, teachers seemed to have attended to those
scientific inquiry criteria that they were already well-versed with. Teachers’ scaffolding of students in writing
scientific research papers reflected students’ achievement of 7 of the 11 criteria. Mentoring teachers to
develop student abilities in the forms of scientific thinking reflected by the 11 criteria while conducting
research projects is important. In particular, teachers need epistemic affordances to cultivate students’
scientific thinking based on the following two scientific criteria: criterion 7—‘‘Make logical connections
between evidence and scientific explanation,’’ and criterion 11—‘‘Defend scientific arguments connected
with investigation, evidence, and scientific explanation.’’ Teachers should be mentored to facilitate student
recognition of the relationship among event, evidence, and explanation to make and defend a scientific
argument. To attend to the foregoing epistemic aspect of scientific inquiry, real-context inquiry with IT should
be augmented with design tools in the virtual platform (Friedrichsen et al., 2003; Sandoval & Reiser, 2004).
Teachers should learn to scaffold students’ thinking on the credibility and quality of scientific inquiry criteria,
particularly the ones that were neglected. Hence, professional development of teachers in the neglected
criteria of scientific inquiry is crucial so that they may engage their students in technology-embedded
scientific inquiry, which attends to sound criteria as characterized by the new standards (NRC, 1996, 2000)
and science educators’ new insights (Duschl & Grandy, 2008). Learning to use IT to monitor and assess
the environmental issue as well as understanding all of the scientific inquiry criteria are essential in a research
project.
Implications Pertaining to Limitations of the Study
The limitations of this study point to three viable implications: translatability to diversity, access to
technology-adept teachers, and access to technologies.
Translatability to Diversity
In each of the 3 years (2005–2008), all TITiC schools were encouraged to take part in this study (see
demographics of the sample). However, teachers who administered the surveys were from one urban school,
consisting primarily of Arab-Americans, three suburban schools, and one rural school consisting of
Caucasians. Although schools consisting of African-American students participated in the TITiC project,
they were not involved in our research activities. Hence, the study results on student IT fluency and
achievement in scientific inquiry abilities can be translated to schools consisting primarily of Caucasian and
Arab-American student populations, regardless of their locale (urban, suburban, or rural) and regardless of
their social class (rich, middle, or poor). The non-participation of African-American schools in
research activities poses a limitation in translating our work to African-American student population across
the US. This limitation needs to be addressed in future strategic grant activities, perhaps by embedding
research questions and activities directly into evaluation of the project, where all participating teachers and
students are required to take part. In the TITiC project, in-depth research was separate and no incentives were
provided to teachers and students for research participation.
Access to Technology-Adept Teachers
The IT fluency of participating students reached proficient levels because they were educated by
technology-adept teachers. Students do not achieve when they do not have access to adequately trained
teachers. For example, 60% of physical science students are not taught by qualified teachers (The National
Commission on Mathematics and Science Teaching for the 21st Century, 2000). If we want to educate our
students to meet the technological demands of the 21st century, we need to work on the premise: ‘‘What teachers
know and can do is the most important influence on what students learn’’ (in What Matters Most, The National
Commission on Teaching and America’s Future, 1997, pp. 6–8).
The specially designed 10-day summer institute engaged teachers in applying IT to authentic issues
related to the Lake Erie watershed and the surrounding environment. Additionally, the teachers were provided
with necessary background information related to the Lake Erie ecosystem that shaped their research projects
and methods to collect and analyze field data, and prepare for student research projects. During the school
year, the TITiC teachers engaged a small group of students in environmental research projects and
technology experts mentored them during the research process. For example, the evaluator’s report (2009) to
the National Science Foundation stated that the TITiC teachers had mentioned that their knowledge of various
information technologies increased, including scientific sampling, spectrophotometer-based water analysis,
hand-held devices (probes) for data collection/analysis, and calculator-computer interfacing. Because of
effective modeling and demonstration of technology integration (Lambert, Gong, & Cuper, 2008) within the
TITiC project, teachers were empowered to use technologies in their classrooms. Students’ success with IT
fluency came about because teachers were adept in IT. Hence, if the nation aspires for our students to use
technologies successfully, then we need to improve teachers’ own technical expertise and professional
experiences by overcoming the barriers that deter them from gaining expertise and experience (Office of
Technology Assessment, 1995). Unless teachers have intensive professional development like the TITiC
project, with specific learning goals for students such as IT fluency and scientific inquiry abilities, students will
not develop knowledge and skills in these areas. Teachers’ lack of adequate training in the use of
technologies is still considerable (Yildirim, 2009), and thus projects like the TITiC that impact students’ IT
fluency and scientific inquiry abilities should be used as models for teacher education and training. The TITiC
teachers were comfortable in using technologies because of access to technologies and sufficient practice, for
the reasons alluded to by Levin and Wadmany (2008). Thus, the teachers were able to transfer technological
skills and strategies that they had learned to students’ research projects as Ertmer (2005) and Glazer and
Hannafin (2006) point out. Bransford et al. (2000) succinctly state, ‘‘When teachers learn to use new
technology in their classrooms, they model the learning process for students; at the same time, they gain new
insights on teaching by watching their students learn’’ (p. 226). This was reflected by student IT fluency and
scientific inquiry abilities. The student attainment of learning goals followed from a familiar principle:
‘‘Indeed, a truism for effective professional development is that it should mirror the approaches teachers are
being asked to enact with their students’’ (Wiske et al., 2001, p. 484).
Access to Technologies
Teacher and student access to the appropriate technologies (hardware and software) is another area of
conversation in terms of the implications of this study. In order to conduct environmental research projects,
the TITiC project participants required probes, calculators, water quality test kits, associated software, GPS
hardware and GIS software licenses. These hardware and software were provided by the National Science
Foundation with a goal to build teacher capacity and develop student literacy in IT. The equipment provided
was adequate for each TITiC teacher to work with a small group of students. When the teachers attempted to
integrate technologies into their curriculum or engage several teams of students in scientific inquiry, they
encountered challenges because they required multiple sets of technologies. The schools that participated in
our research study had access to multiple sets and computers/laptops. For example, as soon as the teachers in
the affluent school district that took part in the study realized the power of using these in scientific inquiry,
they purchased several sets of the tools they needed. The rural school already had multiple sets in its research
center. Although schools are promoting the use of high-end technologies for laboratory learning and
increasingly providing access to these (Metcalf & Tinker, 2004; Parr, Jones, & Songer, 2004), there are
thousands of schools for which such tools are a huge financial burden. Even schools involved in the
NSF-funded TITiC program felt the financial pinch when they attempted to use technologies with more
students in their classrooms. This study has clearly demonstrated that those students who had access to
technologies benefited with respect to IT fluency and cultivating scientific inquiry abilities. Thus, the
technologies we used in the project need to be perceived as essential for scientific inquiry if we are to develop
IT literacy and scientific inquiry abilities. If so, financial investment should be made by schools to make
technologies accessible to teachers and students despite the stringent budgets under which schools operate.
The research reported is being undertaken as part of the project Translating Innovative
Technologies into Classrooms (TITiC): Student-Teacher Scientific Research in Lake Erie
Water Sheds. Funding of this study by the NSF-ITEST-TITiC under Project No. ESIE 0423387
is gratefully acknowledged.
Appendix A: List of Research Projects
(1) Erie Marsh Monitoring Program: Frogs and Toads.
(2) Age and Length Correspondence of Various Species of Fish on Campus.
(3) Invasive Species Control on Campus.
(4) Munson Park Prairie Restoration Project.
(5) Nesting Birds on Campus.
(6) River Raisin Water Quality.
(7) Surface Temperature Studies on Campus.
(8) pH Level in Precipitation.
(9) Emerald Ash Borer Destruction in relation to Tree Placement.
(10) The Surrounding Rivers Impact on Lake Erie and its Watershed.
(11) The Study of Soil Content.
(12) Fecal Coliform Bacteria in the Lower Ecorse Creek and Detroit River.
(13) Dissolved Oxygen-Related Effects of Thermal Pollution on Aquatic Life in the Lake Erie
Watershed.
(14) Water Comparison Between the Detroit and Huron Rivers.
(15) Marina Pollutants along the Detroit River.
(16) A Study of pH Levels in Lakes, Rivers and Ponds surrounding the Lake Erie Watershed.
(17) Seasonal Changes in Chloride Concentrations and Total Dissolved Solids in the Lower Ecorse
Creek/Detroit River.
(18) Aquatic Macroinvertebrate Communities as Indicators of Pollution in the Frank and Poet Creek.
(19) Native Species Field Guide.
(20) The Effect of Light on the Growth of Big Bluestem.
(21) The Effects of Road Salting on Water Quality.
(22) Macroinvertebrates in the River Raisin.
(23) A Comparison of Our Fish Tanks and Pond.
(24) Invasive Species on Campus.
(25) GLOBE Surface Temperature Study.
(26) DTE Prairie Restoration Monitoring.
(27) Erie Marsh Nature Preserve Marsh Monitoring Program.
(28) Nesting Birds on Campus.
(29) Ives Road Fen Groundwater Monitoring Project.
(30) The Effects of Chlorine on Fish.
(31) War of the Plants.
(32) Study of Spider Web Structures.
(33) Water Quality at Ives Road Fen Preserve.
(34) Macroinvertebrates in the River Raisin.
(35) Long-Term Study on the Effects of Road Salt and Volume on the Salt Load Levels of Bean Creek.
(36) The Effect of Road Salt on Macro Invertebrates.
(37) Pond and Fish Tank Comparison.
(38) Tree Height vs. Circumference.
Appendix B: Scientific Inquiry Rubrics

Each scientific inquiry ability is scored as Missing (0 points), Beginning (1 point), Developing (2 points),
Proficient (3 points), or NA.

1. Define a scientific problem based on personal or societal relevance with need and/or source
Missing: Defines no scientific problem with need and/or source.
Beginning: Defines scientific problem improperly; without defensible statement of need and/or source.
Developing: Defines scientific problem partially accurately; with either the defensible statement of need or source.
Proficient: Defines scientific problem accurately; with defensible statement of need and source.

2. Formulate a statement of purpose and/or scientific question
Missing: Formulates no statement of purpose and/or scientific question.
Beginning: Formulates statement of purpose and/or a scientific question with no clarity.
Developing: Formulates statement of purpose and/or scientific question with partial clarity.
Proficient: Formulates statement of purpose and/or scientific question with clarity.

3. Formulate a testable hypothesis and propose explanation(s)
Missing: Formulates no testable hypothesis and proposes no explanation(s).
Beginning: Formulates testable hypothesis and proposes incoherent explanation(s).
Developing: Formulates testable hypothesis and proposes partially coherent explanation(s).
Proficient: Formulates a testable hypothesis and proposes coherent explanation(s).

4. Demonstrate logical connections between scientific concepts guiding a hypothesis and research design
Missing: Demonstrates no logical connections between scientific concepts guiding a hypothesis and research design.
Beginning: Demonstrates improper connections between scientific concepts guiding a hypothesis and research design.
Developing: Demonstrates partial connections between scientific concepts guiding a hypothesis and research design.
Proficient: Demonstrates logical connections between scientific concepts guiding a hypothesis and research design.

5. Design and conduct scientific investigations related to the hypothesis
Missing: Methods and procedures are not logically outlined; no proper measuring equipment is used; safety precautions are not heeded; no repeated trials.
Beginning: Methods and procedures are outlined but difficult to follow; measuring equipment is used carelessly; safety precautions are heeded carelessly; trials are insufficient to test the hypothesis.
Developing: Methods and procedures are outlined but not logically sequenced; measuring equipment is used with some care; some attention is paid to safety precautions; there is evidence of repeated trials to test the hypothesis.
Proficient: Methods and procedures are logically outlined; measuring equipment is used well; close attention is paid to safety precautions; repeated trials are sufficient to validate the results.

6. Collect and analyze data systematically and rigorously with appropriate tools
Missing: Collects and analyzes no data.
Beginning: Collects and analyzes data with errors and/or gaps.
Developing: Collects and analyzes data with minor inaccuracies.
Proficient: Collects and analyzes data with accuracy.
7. Make logical connection between evidence and scientific explanation
Missing: Makes no logical connection between evidence and scientific explanation.
Beginning: Makes weak connection between evidence and scientific explanation.
Developing: Makes some connection between evidence and scientific explanation.
Proficient: Makes logical connection between evidence and scientific explanation.

8. Use a variety of technologies for investigation
Missing: Uses no technologies for investigation.
Beginning: Uses technologies ineffectively for investigation.
Developing: Uses technologies with some effectiveness for investigation.
Proficient: Uses technologies effectively for investigation.

9. Use mathematical tools and statistical software (students should use these for collecting, analyzing, and displaying data in charts and graphs and for doing statistical analysis)
Missing: Uses no mathematical tools and statistical software.
Beginning: Uses mathematical tools and statistical software ineffectively.
Developing: Uses mathematical tools and statistical software with some effectiveness.
Proficient: Uses mathematical tools and statistical software with effectiveness.

10. Communicate through scientific paper for replication and enhancement
Missing: Communicates through scientific paper with no allowance for replication and/or enhancement of investigation.
Beginning: Communicates through scientific paper with less clarity and accuracy, causing difficulty for replication and/or enhancement of investigation.
Developing: Communicates through scientific paper with some clarity and accuracy that may not fully allow for replication and/or enhancement of investigation.
Proficient: Communicates through scientific paper with clarity and accuracy to enable replication and/or enhancement of investigation.

11. Defend scientific arguments connected with investigation, evidence, and scientific explanation
Missing: Defends no scientific arguments connected with investigation, evidence, and scientific explanation.
Beginning: Defends scientific arguments weakly connected with investigation, evidence, and scientific explanation.
Developing: Defends scientific arguments reasonably connected with investigation, evidence, and scientific explanation.
Proficient: Defends scientific arguments strongly connected with investigation, evidence, and scientific explanation.

References

American Association for the Advancement of Science. (1993). Benchmarks for scientific literacy: A tool for
curriculum reform. Project 2061. New York: Oxford University Press.
Adams, D.D., & Shrum, J.W. (1990). The effects of microcomputer-based laboratory exercises on the acquisition of
line graph construction and interpretation skills by high school biology students. Journal of Research in Science Teaching,
27, 777–787.
Bednarz, S.W. (2004). Geographic Information Systems: A tool to support geography and environmental education?
GeoJournal, 60, 191–199.
Bednarz, S.W., & Van der Schee, J. (2006). Europe and the United States: The implementation of geographic
information systems in secondary education in two contexts. Technology, Pedagogy and Education, 15(2), 191–205.
Bell, R., & Trundle, K. (2008). The use of a computer simulation to promote scientific conceptions of moon phases.
Journal of Research in Science Teaching, 45, 346–372.
Bransford, J.D., Brown, A., & Cocking, R. (Eds.). (2000). How people learn: Mind, brain, experience and school.
Washington, DC: National Academy Press.
Campbell, D.T., & Stanley, J.C. (1963). Experimental and quasi-experimental designs for research. Boston:
Houghton Mifflin.
Cañas, A.J., Ford, K.M., Novak, J.D., Hayes, P., Reichherzer, T., & Suri, N. (2001). Online concept maps: Enhancing
collaborative learning by using technology with concept maps. The Science Teacher, 68(4), 49–51.
Chang, H., Quintana, C., & Krajcik, J.S. (2010). The impact of designing and evaluating molecular animations on how
well middle school students understand the particulate nature of matter. Science Education, 94(1), 73–94.
Clark, D.B., & Sampson, V. (2008). Assessing dialogic argumentation in online environments to relate structure,
grounds, and conceptual quality. Journal of Research in Science Teaching, 45(3), 293–321.
Council of Ministers of Education of Canada. (1997). Common framework of science learning outcomes. Toronto:
Author.
Davis, E.A., & Linn, M.C. (2000). Scaffolding students’ knowledge integration: Prompts for reflection in KIE.
International Journal of Science Education, 22, 819–837.
Department for Education and Employment. (1999). The national curriculum for England. London: HMSO.
Retrieved September 25, 2008, from http://www.standards.dfes.gov.uk/schemes2/science/.
Dori, Y.J., & Sasson, I. (2008). Chemical understanding and graphing skills in an honors case-based computerized
chemistry laboratory environment: The value of bidirectional visual and textual representations. Journal of Research in
Science Teaching, 45(2), 219–250.
Duschl, R.A., & Grandy, R.E. (Eds.). (2008). Teaching scientific inquiry: Recommendations for research and
implementation. Rotterdam, Netherlands: Sense Publishers.
Ebenezer, J.V. (2001). A hypermedia environment to explore and negotiate students’ conceptions: Animation of the
solution process of table salt. Journal of Science Education and Technology, 10(1), 73.
Ebenezer, J.V., & Puvirajah, A. (2005). WebCT dialogues on particle theory of matter: Presumptive reasoning
schemes. Educational Research and Evaluation: An International Journal on Theory and Practice, 11(6), 561–589. Special
Issue: The Role of Research in Using Technology to Enhance Learning in Science. Guest Editors: Zacharias C. Zacharia
and Constantinos P. Constantinou.
Ertmer, P.A. (2005). Teacher pedagogical beliefs: The final frontier in our quest for technology integration.
Educational Technology Research and Development, 53(4), 25–39.
Flick, L., & Bell, R. (2000). Preparing tomorrow’s science teachers to use technology: Guidelines for Science
educators. Contemporary Issues in Technology and Teacher Education [Online serial], 1(1), Available: http:/ /
www.citejournal.org/vol1/iss1/currentissues/science/article1.htm.
Friedrichsen, P., Munford, D., & Zembal-Saul, C. (2003). Using inquiry empowering technologies to support
prospective teachers’ scientific inquiry & science learning. Contemporary Issues in Technology and Teacher Education,
3(2), 223–239.
Glazer, E.M., & Hannafin, M.J. (2006). The collaborative apprenticeship model: Situated professional development
within school settings. Teaching and Teacher Education, 22, 179–193.
Griffin, A.R., & Carter, G. (2008). Uncovering the potential: The role of technologies on science learning of middle
school students. International Journal of Science and Mathematics Education, 6(2), 329–350.
Hess, G.R.,&Cheshire, H.M. (2002). Integrating spatial information technologies into forestry and natural resources
curricula. Journal of Forestry, 100(1), 29–34.
Hoadley, C.M.,&Linn,M.C. (2000). Teaching science through on-line peer discussions: Speakeasy in the knowledge
integration environment. International Journal of Science Education, 22, 839–857. http:/ /www.globe.gov/.
International Society for Technology in Education. (2008). National Educational Technology Standards for Students.
Retrieved September 12, 2008, from http:/ /www.iste.org/Content/NavigationMenu/NETS/ForTeachers/2008Stan-
dards/NETS_for_Teachers_2008.htm.
Jackson, S., Krajcik, J., & Soloway, E. (2000). Model-It: A design retrospective. In JacobsonM. &Kozma R. (Eds.),
Advanced designs for the technologies of learning: innovations in science and mathematics education. Hillsdale, NJ:
Erlbaum.
Kerski, J., (2003). The implementation and effectiveness of GIS technology and methods in secondary education.
Journal of Geography 102(3), 128–137.
Kolodner, J.L., Owensby, J.N., &Guzdial,M. (2004). Case-based learning aids. InD.H. Jonassen (Ed.), Handbook of
Research for Education Communications and Technology (2nd ed., pp. 829–861). Mahwah, NJ: Lawrence Erlbaum
Associates.
Knezek, G., & Christensen, R. (2002). Impact of new information technologies on teachers and students. Education
and Information Technologies, 7(4), 369–376.
Kwon,O.H. (2002). The effect of calculator-based ranger activities on students’ graphing ability. School Science and
Mathematics, 102, 5–15.
Lambert, J., Gong, Y., & Cuper, P. (2008). Technology, transfer, and teaching: The Impact of a single technology
course on preservice teachers’ computer attitudes and ability. Journal of Technology and Teacher Education, 16(4), 385–
410.
Lapp, D.A., & Cyrus, V.F. (2000). Using data-collection devices to enhance students’ understanding. Mathematics
Teacher, 93(6), 504–510.
Lederman, N.G. (2004). Syntax of nature of science within inquiry and science instruction. In L.B. Flick & N.G.
Lederman (Eds.), Scientific inquiry and nature of science (pp. 301–317). Bordrecht: Kluwer Academic Publishers.
Levin, T., & Wadmany, R. (2008). Teachers’ views on factors affecting effective integration of information
technology in the classroom: Developmental scenery. Journal of Technology and Teacher Education, 16(2), 233–263.
Liang, L.L., Ebenezer, J., &Yost, D.S. (2010). Characteristics of pre-service teachers’ online discourse: The study of
local streams. Journal of Science Education and Technology, 19(1), 69. http:/ /www.springerlink.com/openurl.asp?gen-
re¼article&id¼doi:10.1007/s109509-9179-x.
Marcum-Dietrich, N.I., & Ford, D.J., (2002). The place for the computer is in the laboratory: An investigation of the
effect of computer probeware on student learning. Journal of Computers in Mathematics and Science Teaching, 21(4),
361–379.
STUDENTS IN ENVIRONMENTAL RESEARCH PROJECTS 115
Journal of Research in Science Teaching
Metcalf, S.J., & Tinker, R. (2004). Probeware and handhelds in elementary and middle school science. Journal of
Science Education and Technology, 13(1), 43–49.
National Space-Based Positioning, Navigation, and TimingCoordinationOffice. (2009). Global Positioning System:
Serving the world. U.S. Government/Coast Guard Navigation Center. Retrieved on April 13, 2010 http:/ /www.gps.gov/.
National Commission on Teaching and America’s Future (NCTFAF). (1997). Doing what matters most: Investing
in quality teaching. NY: Author, Teachers College, Columbia University. http:www.ncta.org/documents/
DoingWhatMattersMost.pdf.
National Commission onMathematics and ScienceTeaching for the 21st Century. (2000). Before it’s too late, a report
to the nation from theNationalCommission onMathematics and ScienceTeaching for the 21stCenturyU.S.Department of
Education, Retrieved on September 27th, 2009 from http:/ /www.ed.gov/inits/Math/glenn/report.pdf.
National Research Council. (1996). National science education standards. Washington, DC: National Academy
Press.
National Research Council. (2000). Inquiry and the national education standards. Washington, DC: National
Academy Press.
Office of Technology Assessment. (1995). Technology access and instructional use in schools today. Washington,
DC: U.S. Congress. Retrieved onMarch 23, 2009 from http:/ /www.princeton.edu/�ota/disk1/1995/9541/954105.PDF.
Parr, C.S., Jones, T., & Songer, N.B. (2004). Evaluation of a handheld data collection interface for science learning.
Journal of Science Education and Technology, 13, 233–243.
Polman, J., & Pea, R. (2001). Transformative communication as a cultural tool for guiding inquiry science. Science
Education, 85(3), 223–238.
Ramos, B.,Miller, S.,&Korfmacher, K. (2003). Implementation of a geographic information system in the chemistry
laboratory: An exercise in integrating environmental analysis and assessment. Journal of Chemical Education, 80(1), 50–
53.
Rutherford, J.F., &Ahlgren, A. (1989, 1990). Science for all Americans. AmericanAssociation for the Advancement
of Science: Project 2061. New York: Oxford University Press. 272 p.
SAMPI. (2005, 2006, 2007). Translating innovative technologies into classrooms. External evaluation report
submitted to the National Science Foundation. MI: Western Michigan University.
Sanders, R.L.J., Kajs, L.T., & Crawford, C.M. (2001). Electronic mapping in education: The use of geographic
information systems. Journal of Research on Technology in Education, 34(2), 121–129.
Sandoval, W.A., & Reiser, B.J. (2004). Explanation-driven inquiry: Integrating conceptual and epistemic scaffolds
for scientific inquiry. Science Education, 88, 345–372.
Schultz, S. (2003). Probe science teaching and learning. Stanford Educator. Stanford University, Spring 2000, 13
June.
Tate, E. (2005). Hanging with friends, velocity style! A preliminary investigation of how technology-enhanced
instruction impacts students’ understanding of multiple representations of velocity. Poster presented at the annual meeting
of the American Educational Research Association, Montreal, Canada.
Wiske (2001). New technologies to support teaching for understanding. International Journal of Educational
Research, 35, 483–501.
Yildirim, S. (2009). Effects of an educational computing course on preservice and inservice teachers: A discussion
and analysis of attitudes and use. Journal of Research on Computing in Education, 32(4), 479.
Zerger, A., Bishop, L.D., Escobar, F., & Hunter, G.J. (2002). A self-learning multimedia approach for enriching GIS
education. Journal of Geography in Higher Education, 26(1), 67–680.