Does Affect Impact Student Achievement?


Background

Educators are experiencing undue pressure to perform in this era of education accountability driven by evidence-based instruction. The pressure to show adequate student performance on standardized tests has led many educators to allocate a larger portion of their classroom instructional time to test preparation instead of teaching higher-order learning and thinking skills (Tapia & Marsh, 2004). This shift in instructional time is causing educators to sacrifice other crucial teaching and learning components believed to improve student learning, such as student interest, motivation, self-confidence, the value of the subject matter, and enjoyment (Chamberlin, 2010). In this article, I define the term student affect, trace the evolution of this psychological construct, present some of the challenges of measuring it, explain why I plan to measure student affect in my dissertation research study, and conclude by explaining affect as it relates to my dissertation research.

Definition of the Term “Student Affect”

            The term affect carries many meanings in the field of psychology. It has been referred to as motivation (Chouinard & Roy, 2008; Shin, Lee, & Kim, 2009, as cited in Chamberlin, 2010), dispositions (Gresalfi, 2009, as cited in Chamberlin, 2010), beliefs (as cited in Chamberlin, 2010), emotions (Grootenboer, 2003, as cited in Chamberlin, 2010), and attitudes (Chouinard & Roy, 2008, as cited in Chamberlin, 2010). The myriad of terms can be confusing. However, Anderson and Bourke (2000) define affect as a construct consisting of sub-components such as “anxiety, aspiration(s), value, attitude(s), interest(s), locus of control, self-efficacy, and self-esteem” (p. 1). Furthermore, Anderson and Bourke (2000) argue that motivation and affect carry the same meaning, because motivation is expressed throughout all sub-components of affect. Thus, affect is a complex psychological construct expressed in various terms that carry similar, and sometimes identical, meanings.

The Evolution of the Construct and its Measurement

            The psychological construct of affect gained recognition in the early 20th century; however, researchers at the time did not have instruments or inventories to measure or quantify it (Thompson, 1992, as cited in Chamberlin, 2010). In the 1920s and 1930s, affect was considered a non-observable behavior because of the immense interest in behaviorist research, a line of research that concentrated on investigating observable behavior. As a result, little interest and effort was directed toward non-behaviorist research, and researchers of this period paid little or no attention to student affect.

However, affect regained traction with a new generation of researchers in the 1960s and 1970s. Over the past 40 years there has been increased attention to research on affect, especially by researchers in mathematics, science, and the social sciences. During this time, researchers have attempted to define, characterize, and develop instruments for measuring student affect in mathematics more than in any other subject area. The sheer number of instruments developed to assess affect during this period is enormous, and I will not attempt to list them all here; instead, I will mention a few of the most popular. A summary of popular instruments used to measure affect’s sub-components is presented in Table 1 below:

 Table 1

Summary of Student Affect Instruments

| Name of Instrument | Acronym | Affect Sub-Component | Grade Level | Person(s) Who Conducted the Study |
| --- | --- | --- | --- | --- |
| Attitude Towards Mathematics Inventory | ATMI | Self-efficacy, value, anxiety, and motivation | Secondary: high school | Tapia & Marsh |
| Mathematics Attitude Scale | None | Value and enjoyment | Tertiary: college freshmen | Aiken |
| Mathematics Anxiety Rating Scale | MARS | Anxiety | Tertiary: freshman to senior | Richardson & Suinn |
| Fennema-Sherman Mathematics Attitude Scale | None | Attitude, self-efficacy, anxiety, and motivation | Secondary: high school | Fennema & Sherman |
| National Longitudinal Study of Mathematical Ability | NLSMA | Attitude | Secondary: grade 8 | School Mathematics Study Group |

 

Challenges Associated with Measuring Students’ Affect

            The biggest barrier to measuring affect is that affect is a psychological construct. Adding to the complexity is the fact that affect is composed of many sub-components, namely anxiety, aspiration, attitude, interest, locus of control, self-efficacy, self-esteem, and value. Because affect is a psychological construct, its attributes cannot be measured directly. Unlike physical attributes such as length, weight, and temperature, for which society has agreed upon units of measurement (meters for length, kilograms for weight, kelvin for temperature, and so on), affect attributes such as anxiety, self-confidence, and enjoyment have no agreed-upon measuring units and are therefore far more difficult to measure (Chamberlin, 2010). Measuring affect is further complicated by the fact that each of affect’s sub-components has three characteristics: target, direction, and intensity. Target refers to the object, activity, or idea toward which the feeling is directed; direction refers to the positive or negative orientation of the feeling; and intensity refers to the strength of the feeling. Thus, with no agreed-upon units of measurement and with the many characteristics associated with affect, it is indeed difficult to measure.

 Quantifying affect’s sub-components is complex and problematic, but not impossible. Recently, some psychologists have successfully quantified and assessed aspects of student affect in schools using sophisticated statistical programs and software. However, a great deal of the research on affect still lacks empirical evidence (Tapia & Marsh, 2004). In light of this promising development, I plan to assess three sub-components of affect in my dissertation research study. I believe self-confidence, enjoyment, and value of the subject matter are important factors to measure because they relate most closely to student performance in an academic setting. I will not include the other sub-components of affect (i.e., anxiety, aspiration, attitude, interest, and locus of control), because they are less closely related to my research topic and the course of study.
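To make the scoring idea concrete, here is a minimal sketch of how responses to a Likert-style affect inventory might be rolled up into the three sub-components I plan to measure. This is an illustration only, not the Popham/Stiggins instrument itself; the item names, the 1-5 scale, and the item-to-subscale mapping are assumptions for the example.

```python
# A minimal sketch (not the actual instrument): aggregating Likert responses
# into three affect sub-component scores. Items, scale, and mapping are
# hypothetical assumptions for illustration.
from statistics import mean

# Hypothetical item-to-subscale map; "reverse" marks negatively worded items.
SUBSCALES = {
    "enjoyment":       [("q1", False), ("q4", True)],
    "self_confidence": [("q2", False), ("q5", True)],
    "value":           [("q3", False), ("q6", False)],
}

SCALE_MAX = 5  # 1 = strongly disagree ... 5 = strongly agree


def score_affect(responses):
    """Return the mean score per affect sub-component for one student."""
    scores = {}
    for subscale, items in SUBSCALES.items():
        values = []
        for item, reverse in items:
            raw = responses[item]
            # Reverse-scored items flip direction so that a higher score
            # always means a more positive affect toward the subject.
            values.append(SCALE_MAX + 1 - raw if reverse else raw)
        scores[subscale] = mean(values)
    return scores


# Example: one student's responses to the six hypothetical items.
student = {"q1": 4, "q2": 2, "q3": 5, "q4": 1, "q5": 3, "q6": 4}
print(score_affect(student))
# {'enjoyment': 4.5, 'self_confidence': 2.5, 'value': 4.5}
```

In practice, the actual items and any reverse-scored wording would come from the published instrument, and the subscale scores could then be compared before and after instruction.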

Why Will I Measure Student Affect in my Dissertation Research Study?         

            Affect is an important ingredient in learning. In 1916, Binet and Simon stated that non-intellectual characteristics were the single most important factor affecting student teaching and, hence, learning (Chamberlin, 2010). The non-intellectual characteristics they were referring to are what we today call student affect; the label has changed over the years from non-intellectual characteristics, to non-cognitive characteristics, to affective components. Unfortunately, Binet and Simon did not conduct experimental studies, nor did they have empirical evidence to support or discredit their claim. Currently, however, there is ample evidence from the Trends in International Mathematics and Science Study (TIMSS) supporting the idea that student affect is as important as the cognitive components of teaching and learning (Martin & Kelly, 1996; Martin & Foy, 2008; Messick, 1979, as cited in Chamberlin, 2010). The only anomalous TIMSS data are those reported by Mullis, Martin, Gonzalez, and Chrostowski (2004), whose study did not show a correlation between student affect and academic achievement.

The central focus of my dissertation is to determine whether student academic achievement improves when students’ learning style preferences are matched with instructional materials. As I examine this hypothesis, I also plan to investigate student affect as one of the factors affecting student learning. In particular, I would like to investigate three components of student affect: self-confidence, perceived value of the subject matter, and whether students enjoy instruction when the learning materials match their learning style preferences. To do this, I plan to use a modified public domain affect inventory created by Drs. W. James Popham of the University of California, Los Angeles (UCLA) and Rick Stiggins of the ETS Assessment Training Institute, both experts in educational assessment. This instrument will help me collect data to assess the three components of student affect in my research study.

This inventory is similar to the one developed by Aiken (1974); it assesses student enjoyment, self-efficacy, and how students value the subject matter. It was chosen because it is user friendly, appropriate for high school students, has high validity and reliability, and produces results that are easy to interpret. Thus, this dissertation research study will include a section assessing student affect, particularly the three sub-components of enjoyment, self-efficacy, and value. I believe this will add value and fill an important gap in these two areas of education research.

References

Aiken, L. R. (1974). Two scales of attitude toward mathematics. Journal for Research in Mathematics Education, 5, 67-71.

Anderson, L. W., & Bourke, S. F. (2000). Assessing affective characteristics in the schools. Mahwah, NJ: Erlbaum.

Chamberlin, S. A. (2010). A review of the instruments created to assess affect in mathematics. Journal of Mathematics Education, 3(1), 167-182.

Mullis, I. V. S., Martin, M. O., Gonzalez, E. J., & Chrostowski, S. J. (2004). TIMSS 2003 international mathematics report: Findings from IEA’s Trends in International Mathematics and Science Study at the fourth and eighth grades. Chestnut Hill, MA: TIMSS & PIRLS International Study Center, Boston College.

Tapia, M., & Marsh, G. E. (2004). An instrument to measure affect. Mathematics Education Quarterly, 8(2), 56-62.

 

 

 

 

 

 

 

 

 

 

 


Chimney Tops, Smoky Mountains National Park


There were many good moments in Gatlinburg and Pigeon Forge, Tennessee, but this one was among the best, and the best days were many. I do not have a good recollection of the events of each day we spent in the Great Smoky Mountains; I would say this was either the third or fourth day there. Each day we took a walk on the wild side to witness the beauty of nature. On this particular day, we went to climb the famous Chimney Tops Trail.


This trail is designated as strenuous, so we packed our rucksacks lightly with some juice, dried fruit, and a sandwich for the Pili-Pili. The trailhead is located halfway up the mountain on the only road to Cherokee, North Carolina. Once you park your vehicle, the trail starts gently by descending toward the river. It was a beautiful sight, and hugely deceptive given the long, uphill hike to come.


Once you cross a few bridges, the steep and thoughtfully placed steps begin. There are 256 of them; Pili counted them out of boredom. The trail keeps going up, up, and up again, meandering like a giant river approaching the ocean. It’s not the hike that brings hordes of people here; it’s the amazing views on the way up and at the top of the chimneys. I know the pictures you see here don’t do justice to the actual views. The 2.3 miles up and 2.3 miles down were as joyous as anything I have done in years.


Big Results Now: A Quick Fix Solution to Education in Tanzania


The tables have turned. Or am I seeing the work of a magician? Enough with the jokes! Seriously, Tanzania has been the laughing stock of East Africa with regard to its education system for a while. We all know that change takes time. Meaningful and lasting changes in education, especially, don’t happen overnight (read here, here, and here), and quick fixes have unintended consequences (read here). However, I am happy to say that Tanzania has found a magic formula to raise student achievement in the shortest amount of time through its Big Results Now program.

Two years ago, failure rates at the primary, secondary, and high school levels were through the roof (read here). The 2012 examination results for secondary schools were the lowest in the history of the Tanzanian education system. However, in less than a year of Big Results Now, we are seeing the biggest jump in exam results ever seen anywhere in the world of education. Has the system really changed? Or is it a mirage?

What I believe is this: for change to happen, underlying causes need to be addressed. Has the education system in Tanzania addressed the challenges it faces: a shortage of teachers, lack of quality instruction in classrooms, teacher absenteeism, lack of teaching resources, and lack of laboratories and lab materials for science courses? In my sane mind, I can’t believe that all these challenges have been addressed in less than a year. Unless you believe in miracles, which I don’t, something really shady is in the works here. As they say in Swahili, “kuongeza ukubwa wa magoli” (widening the goalposts) is not a genuine solution to this problem. The problems facing the education system in Tanzania are multi-faceted and need multi-faceted solutions. Quick fixes? No. They will just create spillover effects. What I see is a disaster in the making, the consequences of which will be difficult to remediate with simple and quick fixes.

 

Smoky Mountains National Park


This year we decided to chart a new course for our family summer vacation and take a path less traveled. Once you have been to the Sunshine State too many times, it becomes easier to choose to go elsewhere. I have no complaints about my vacations in Florida. With all its amusement parks, serene beaches, and warm weather, Florida is always going to be the best destination for a summer vacation. I love the place and could visit anytime. However, July’s Florida heat can be a tad too much to bear sometimes.

Since we were trying to expose our daughter to other forms of summer travel adventure this year, we decided to climb the mountains. The decision was easy. While there, we saw some of the best-kept secrets of the southeastern mountain ranges of the United States. Gatlinburg sits at the base of Great Smoky Mountains National Park. Next to it is Pigeon Forge, the land of Dolly Parton. While there, you can do just about anything touristy: amusement parks, scaring your pants off at the many Ripley’s attractions, or grabbing a cabin in the mountains and living a completely quiet week all to yourself. We chose the latter.

Here are a few pictures from my nature hikes at the Laurel Falls and Clingmans Dome. Enjoy.

 


The Counter-factual Causal Inference in Experimental Design


By: Shaaban Fundi

The article entitled “The Qualitative Method of Impact Analysis” by Mohr (1999) attempts to establish qualitative study design as a rigorous and explicit method for impact analysis (impact evaluation purposes). In this article, Mohr discusses the problems qualitative methods face when used to study impact. He asserts that impact is fundamentally a causation problem, and that causal impact analyses are better evaluated using a quantitative methodology. Mohr argues that the main issue lies squarely in the definition of causality. The most accepted definition of causation is based on the counterfactual definition of causality: if Y occurs, then X must have occurred. This aligns well with the quantitative methodology of impact evaluation. According to Mohr (1999), a more defensible version of the counterfactual definition is called factual causation, which states that “Y was caused by X if and only if X and Y both occurred and, in the circumstances, if X had not occurred, then neither would Y” (Mohr, 1999, p. 71). Because of this, causation is best established when things are compared; causality is derived from comparing results from the experimental group with those from the control group. Without this basis of putting two sets of observations together to determine the variance attributable to the treatment variable, statistical analysis would not be possible.
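Stated compactly (the notation is mine, not Mohr's), the two definitions contrasted above can be written as:

\[
\text{Counterfactual causation: } X \text{ caused } Y \iff (\neg X \Rightarrow \neg Y)
\]
\[
\text{Factual causation (Mohr, 1999): } X \text{ caused } Y \iff X \wedge Y \wedge (\neg X \Rightarrow \neg Y, \text{ in the circumstances})
\]

The comparison of an experimental group with a control group is, in effect, an empirical stand-in for the unobservable "not X" condition.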
Based on the counterfactual definition of causality, it seems impossible to use a qualitative methodology to evaluate impact. To determine impact, qualitative methods must rely on something other than counterfactual evidence to establish causal inferences, because a qualitative methodology cannot show the concurrence of X and Y using the treatment group and control group comparison that is prevalent in quantitative designs. However, Scriven (1976, as cited in Mohr, 1999) offers an approach called the “modus operandi” method that can be used to bypass the counterfactual definition of causality. The modus operandi method is a process of elimination: to demonstrate that treatment T caused Y to occur, other possible causes of Y, such as U, V, and W, must be eliminated as contenders. The modus operandi approach is commonly used in the daily work of professionals such as doctors, police officers, educators, and investigators. It does not meet the counterfactual definition of causality used in quantitative study designs. However, because of the modus operandi method, qualitative study designs can be used to determine the impact of programs by using elimination to establish causal inferences. Thus, no comparison variables are needed to establish causation in qualitative designs, because physical causality, rather than factual causality, can produce compelling evidence that T caused Y once all the other contenders have been eliminated. Causal reasoning can therefore be reliably used in qualitative designs to draw causal inferences in program and impact analysis.
I enjoyed reading this article because it offered me practical and useful insights into conceptualizing causal inference. I have learned that the debate on causation between quantitative and qualitative researchers centers largely on the definition of causation. For supporters of quantitative designs, causation is defined mainly by the counterfactual definition of causality: causation is determined by comparing two sets of observations (control and experimental values). Proponents of qualitative designs, on the other hand, hold that causation can be established through a process of elimination, and they argue that this process is commonly used in our daily lives without comparisons or variables. I can relate this to my research. There are several similarities between my research design and the process of elimination described in this article. My research follows the quantitative tradition but does not involve a control group. Thus, the causal inferences I can draw from my single-participant research design rest largely on careful control of internal threats to validity rather than on a comparison between control and experimental groups, because no control group exists. As a researcher, I plan to incorporate the practical insights and steps for determining causal inference discussed in this article into my own research, especially during the design phase (to eliminate other possible causes of an increase in student scores) and during data interpretation.
Reference
Mohr, L. B. (1999). The qualitative method of impact analysis. American Journal of Evaluation, 20(1), 69-84.

Learning and Teaching Style Assessment


Multiple Intelligences

                In his 1983 book Frames of Mind: The Theory of Multiple Intelligences, Gardner challenged the traditional definition of intelligence as too narrow and argued that a broader definition was needed to more accurately reflect the different ways that humans think and learn. Each individual, he argued, possesses a unique blend of multiple intelligences (MI), and he opposed the idea of using the same techniques to teach and assess every child. He defined eight types of intelligence: musical-rhythmic, visual-spatial, verbal-linguistic, logical-mathematical, bodily-kinesthetic, interpersonal, intrapersonal, and naturalistic. When I took the MI test, I was not surprised to find that I have naturalist intelligence with some musical-rhythmic intelligence. This observation explains perfectly the path I took during the early years of my education. I spent four years of my undergraduate education studying marine science and microbiology, two years in graduate school studying environmental science with a specialty in water resource management, and three years earning a master’s degree in science education. Currently, I am pursuing a PhD in Curriculum and Instruction.
I have always been fascinated with nature and the natural environment, especially the interdependence among living things and their interactions with each other, with other species, and with the environment. I now realize how my MI affects the way I teach and learn. Furthermore, I have come to understand that my MI could have a positive or a negative effect on my students’ learning experiences in a course. I therefore plan to diversify my teaching and learning strategies to meet the varied MIs of all students in my courses.

Self Assessment

            As I reflect on the strengths and weaknesses of my teaching, three things come to mind. First, I believe I have a firm understanding of content knowledge in chemistry, environmental science, and ecology. Second, I believe I have a firm understanding of teaching methodology in science education. Third, my experience teaching and learning in two contrasting schooling environments in the United States (urban, resource-poor schools and suburban, resource-rich schools) has added tremendous value to my teaching. I believe the combination of these factors has made me a better educator, not only for content knowledge but also for emotional knowledge, values, and critical thinking skills. Like everything in life, I realize that I am nowhere near perfect at what I do as an educator; there is always room for improvement. Thus, I would like to improve on two things. First, communication with stakeholders. I have found myself in troubled situations on many occasions due to a lack of communication, which stems from my belief that I and only I should handle course-related problems. I realize that opening up to others’ suggestions may be a good thing, so I plan to open up a little and hear advice from others. It is not a weakness to incorporate others’ points of view into your own. Second, I tend to offer students too many choices: choices about what to do, how to do it, and how to represent their work. It becomes difficult to assess students’ products fairly when everyone decides to do and represent their work differently. I plan to streamline my assignments and projects to allow for some level of standardization, especially in light of the era of educational accountability we are working under.

 Peer-Assessment

Self-evaluation can be a good thing; however, because of the biases inherent in this process, I decided to ask my co-teacher to evaluate my teaching. This process helps me understand the strengths and weaknesses that my peers see in my teaching. I therefore asked Mr. Miller to reflect on my teaching, especially the areas where he sees strength and the areas where he sees I need improvement. Based on our conversation, these are some of the highlights and lowlights of my teaching.

The highlights: He thought I was very good at managing instructional time and students, that I handled classroom-related issues appropriately, and that I do a good job of making sure each student has a say in the course. He also pointed out that I seem to be fair in my treatment of all students and in grading students’ work. He added that I do a good job of connecting what is learned in the course to students’ prior, present, and future interests and of making content relevant to students’ lives. He also noted my pleasant and jovial mood, which makes my class a place where every student wants to be and feels appreciated.

The lowlights: He mentioned my low level of communication with parents and other stakeholders regarding students’ progress, or lack thereof, in class. He also noted that I tend to repeat concepts a lot, which can be a good thing or a bad thing depending on the group of students in the class.

Student Assessment of my Teaching

            It is my custom to ask my students their opinion of the courses I teach, and I always try to give them an opportunity to reflect on my teaching. I find this type of evaluation refreshing and an important part of improving my craft as an educator. This year was no different. At the end of the semester, I created a course evaluation post on my blog where my students could go and evaluate the course. In the post, I asked my students to rate my teaching on three questions: 1) What did I do well in my teaching? 2) What did I not do very well? 3) If you were to take this course next year, how would you like me to teach it? The reflections from my students were as varied as they were interesting. In general, most students enjoyed the relaxed atmosphere in the class. They reported enjoying the opportunity to engage in hands-on activities, create videos for some of the projects, and present their ideas to the class in a form they felt comfortable with. Some of the things they did not like were: 1) a lack of immediate feedback, and 2) that I spent more time on easy topics (such as the periodic table and physical and chemical changes) and less time on harder concepts (such as nomenclature, stoichiometry, and gas laws). Next school year I plan to use some of their suggestions to improve the course and the environment in which it is taught. I know that as educators we tend to maximize the content and cognitive aspects of teaching and learning while forgetting the affective side. I plan to pay more attention to the student affect side of learning, especially self-confidence, how students value the course, and course enjoyment. In my 11 years as an educator, I have come to the realization that when a course is not enjoyable or has little or no value to students, they tend not to care much about it. I am constantly working to change that.

Buying or Selling in Tanzania?


Africa is changing really fast. That is a fact. Is Africa changing for better or for worse? This is a fundamental question, but the answer depends entirely on your worldview. Are you looking at Africa in terms of its economic opportunities, or in terms of the destruction of its environment? Do you believe in sensible and sustainable growth? In this series of posts about Africa, I will look at Africa, and Tanzania in particular, through the lens of Tanzania’s communication evolution.

So how has technology, especially mobile phone usage, changed in Tanzania over the years? Technology is changing the way people access and process information in Tanzania. I remember vividly when cellphones were introduced in the early 1990s. Service was sporadic, expensive, and mainly a status symbol rather than a form of communication. Fast forward 20 years, and the World Bank reports that the average Tanzanian has two or more mobile phones (World Bank, 2012). According to the World Bank website on technology, the service is quite good and can be accessed almost anywhere in the country. The advancement of mobile phone usage has coincided with advancements in internet connectivity and usage. Internet connectivity in Tanzania is still accessed largely through mobile phones; it is what I call the mobile phone and internet affair. The combination of cellular service penetration and internet connectivity has spurred growth in trade and communication, and its effects will soon be felt in the education sector.

As internet connectivity grows, so does the promise of its use in transforming the education industry in Tanzania. For that matter, here is a link to connect you with that possibility. Hopefully, you will find this new website not just a lone tree in a forest of many advertising websites in Tanzania. At OLX Tanzania we go further to connect you with your clients and with the buying needs you may have. It gives you the reach you need and deserve. If you have a cell phone, you can buy and sell from the comfort of your village kigoda (stool). Enjoy the service.

Course Evaluation


Hello everyone. It was great to have you in my course this semester, and I hope you enjoyed the experience. In my quest to make the course more enjoyable, I would like your input. I also hope you will find a way to use the information you learned in this course in the near future to make your lives better. As we approach the end of the semester, I would like you to share your opinion about the course by clicking this link. It is my hope that you will take this opportunity seriously and offer genuine suggestions to improve the course.

Here are three things I would like you to respond to:

1) What did you like about the course? Think about pacing (too slow, too fast, just about right), information, field trips, out-of-class activities, in-class activities, and so forth.

2) What did you not like?

3) What could I have done differently?

This is completely anonymous. Feel free to express your opinion to help me improve students’ experiences in the course.

Good luck and happy summer, y’all!

 

 

Examining Teachers’ Practical Experiences with Virtual Labs in High School Science: A Narrative Study


CHAPTER ONE

STUDY RATIONALE AND PURPOSE

Problem Statement

Virtual laboratories are quickly replacing hands-on laboratory activities as the norm for teaching and learning science in the high school setting (Van Lejeune, 2002). Van Lejeune (2002) and Mint (1993) describe three main reasons for this shift. First, materials for hands-on laboratory activities are very expensive. Second, the use of chemicals in the classroom could potentially lead to lawsuits if the chemicals are not properly handled by either the teacher or the students. Third, virtual labs can provide a quality experience for students, especially if the teacher lacks in-depth knowledge of the subject being taught. Research findings by Redish and Steinberg (1999) suggest that students learn most effectively in an active-engagement learning environment. Virtual labs, if used properly, can create and foster this kind of active learning environment. Virtual labs also provide a cheaper alternative for school systems struggling with tight budgets (Van Lejeune, 2002) and eliminate the potential for lawsuits associated with the use of strong or potentially poisonous chemicals (Mint, 1993).

Despite the numerous potential benefits associated with using virtual laboratories to teach science in the high school setting, few studies have been conducted to assess teachers’ practical experiences with virtual laboratories and how these experiences can be used to identify best practices for improving praxis among teachers, especially new science teachers. Results from several studies suggest that online labs and videos can be as effective as physical, hands-on lab activities (Leonard et al., 1992; Malderelli, 2009; Cengiz, 2010; Gobert et al., 2011; Tatli & Ayas, 2013; Kun-Yuan & Jian-Sheng, 2007). In addition, a study among high school students identified a number of positive effects associated with using technology in the classroom (Reid-Griffin & Carter, 2004), including improved student achievement and better student engagement. Furthermore, the individualized nature of technology empowers students to take more risks in their learning and to be more willing to make mistakes. Controversy around virtual labs remains, however, as some researchers (Kennepohl, 2001; Nedic, Matchoska, & Nafalski, 2003; Finkelstein et al., 2005) have found online labs to be less effective than hands-on labs. These researchers also found that students preferred face-to-face labs over virtual labs.

Despite the mixed evidence around the effectiveness of virtual laboratories, the use of these labs in high school science classrooms continues to rise.  The purpose of this research study is to elucidate teachers’ practical experiences with using virtual laboratory activities in their science classroom.  Understanding how teachers experience and use virtual labs in their classroom may provide some context for explaining the discrepancy observed in the literature on the effectiveness of virtual labs at improving student outcomes.  

Why Is Organizational Learning Important?

Learning is an everyday occurrence for most humans (Dewey, 1938). The success of the human race can, in large part, be attributed to the ability of humans to learn and to use that new knowledge to adapt to changes in their environment. Humans, unique among animals, are able to create and share knowledge, and this shared knowledge allows them to make improvements in their environment or organization. This type of learning is called organizational learning (Argyris & Schon, 1978). To improve practice in organizations, including schools, it is crucial to understand shared practical experience.

Moreover, there are three types of informational knowledge: (1) hard, formal knowledge (Hildreth & Kimble, 2002); (2) the paradigmatic mode of knowing (Bruner, 1986); and (3) soft, tacit, practical knowledge (Takeuchi, 1995). Current research indicates that soft, tacit, and practical knowledge can be meaningfully captured using a narrative inquiry approach (Boje, 2007; Czarniawska, 2007; Gabriel, 2000). This study, therefore, will use a narrative approach to investigate teachers’ shared practical experiences with using virtual laboratories to teach science in their high school classrooms. My assumption is that teachers hold valuable personal and practical knowledge. This study will gather that knowledge in order to facilitate the sharing of best practices with teachers unfamiliar with the use of virtual laboratories as a teaching tool. This information will be especially useful for new science teachers, who most often find themselves using virtual labs in their classrooms with little or no training.

What Led Me to This Topic?

I was born and raised in Tanzania, where I attended primary school, secondary school, high school, and university. I came to the United States in 2001 and attended a graduate program in environmental science at Towson University from 2002 to 2004. While attending graduate school, I worked as an assistant laboratory and field technician for the Center for Urban Research and Environmental Education at the University of Maryland, Baltimore County. In that capacity, I investigated water, air, and soil pollution in Baltimore City and Baltimore County. After I graduated with a master’s degree in environmental science, I decided to teach for the Baltimore City Public Schools. In May 2004 I applied to the Baltimore City Teaching Residency (BCTR), a program designed to attract experienced science and mathematics professionals to teach in the Baltimore City school system. I was accepted into the program and was formally hired as a teacher in July 2004. Through the BCTR program, I attended Johns Hopkins University for a master’s degree in education from July 2004 to May 2007.

Throughout my teaching career, I have witnessed many changes in the technology used in schools. When I was first hired as a science teacher, I had little exposure to classroom technology and its uses, and I found it very hard to implement a new technology in the classroom, especially when little or no training was offered to accompany it. For the past five years, I have been using virtual laboratories to teach high school chemistry. These labs teach a variety of concepts, including the difference between chemical and physical changes, the periodic table, naming compounds, and the concept of the mole. I have found virtual laboratories to be an effective tool for teaching concepts where a hands-on lab either does not exist or is too expensive or dangerous to conduct. Since many schools are shifting their investments from hands-on labs to virtual labs, I thought it would be important to gather teachers’ personal and practical experiences with virtual labs to inform this shift and to identify best practices that could be shared with other teachers. I plan to capture, through their narratives, the experiences that teachers have when using virtual labs with their students.

Conceptual Framework

            Learning from experience is central to the creation of practical knowledge in an organization (Cole & Wertsch, 2004). Dewey (1916) suggests that learning from experience is crucial in connecting the past, the present, and the future (as cited in Liu & Mathews, 2005). This study will examine learning from experience through a Vygotskyan social constructivist lens and through personal reflection. According to Wolcott (1990a), personal experiences can be used to examine phenomena such as teachers’ personal and practical experiences with virtual labs.

           Social constructivist theory originated from Vygotsky’s work. It emphasizes collaboration and views learning, or meaning making, as socially constructed (Resnick, 1991). A central concept of Vygotsky’s work is the role that social collectivity plays in learning and development (Liu & Mathews, 2005): individuals learn from each other and form their understanding of the world through their interactions with each other. Social constructivist theory, however, is not without criticism. The major criticism is that it places too much emphasis on the social and collective and ignores the role of the individual in meaning construction. While I acknowledge this criticism, I plan to use social constructivist theory as the basis for my study because I believe that teachers share their experiences with teaching tools, like virtual laboratories, with each other, and it is through this communication that they decide whether or not to use these tools in their own classrooms. Thus, I feel that this theory is most aligned with the purpose of my study. Figure 1 below illustrates the conceptual framework for my study.

 

[Figure 1. Socially Constructed Practical and Personal Experiences of Teachers When Using Virtual Labs: teachers’ past and present experiences, given socially constructed meaning through stories, form practical experience that informs the future.]

 

Study Rationale

As mentioned earlier, the use of virtual labs and online learning continues to rise in high school science courses. This rise has implications for how successful the learning experiences will be for teachers and students. This research will identify teachers’ practical and personal experiences with virtual laboratory activities to help create a body of best practices for other teachers. As noted in my personal and professional narrative, most teachers do not receive formal training on how to use virtual labs effectively with their students; instead, they learn through trial and error how best to implement virtual labs in their classrooms. The risk, however, is that they will not use virtual labs correctly, leading to poor student outcomes. This study will gather teachers’ experiences with virtual labs, including the knowledge they have acquired through using virtual labs in their own classrooms. Best practices will be identified and shared with other teachers who are considering implementing virtual labs in their own classrooms.

Research Questions

As in any qualitative study, choosing the type of qualitative inquiry and the questions to fit that approach is the first challenge. I began by exploring various approaches to qualitative inquiry to see which was most appropriate for answering my research questions. After much deliberation, I chose narrative inquiry to investigate teachers’ practical experiences using virtual labs in their classrooms. In my interviews, I asked eight main questions to elucidate teachers’ experiences with virtual labs. These questions are listed below:

  1. Tell me about your educational and professional background.
    1. Probe: How did you become an educator?
  2. What is your teaching philosophy?
  3. How do virtual labs fit within this philosophy?
  4. How did you learn about virtual labs?
  5. When did you start using them?
  6. Why did you decide to use virtual labs in your classroom?
  7. What do you see as barriers and benefits to using virtual labs with your students?
  8. What adaptations (if any) did you make to ensure that all students in your class benefit from virtual labs?

Summary

In Chapter One, I provided a rationale for my research study and presented the theoretical framework that will form its basis. In addition, I reviewed the questions that I asked the teachers participating in my study in order to elucidate their experiences using virtual laboratories. In Chapter Two, I will review the origin and definition of several key terms related to my study, including Social Constructivist Theory, Deweyan Experience, Schon’s Reflective Practitioner, and Narrative Inquiry.

 

 

 

 

 

 

 

 

 

 

CHAPTER TWO

REVIEW OF THE LITERATURE

Chapter One provided an overview of the purpose of this research study and described the theoretical framework that will be used as its basis. Chapter Two continues this discussion by reviewing some key terms related to the study: “Social Constructivist Theory,” “Reflective Practitioner,” and “Experience.” This chapter also describes narrative inquiry, which forms the basis of this research study. At the surface level, these terms appear very different, but at a deeper level they have inter-related meanings.

Social Constructivist Theory: A Vygotskyan Idea

            As described in Chapter One, social constructivist theory emphasizes the importance of collaboration and views learning, or meaning making, as socially constructed (Resnick, 1991, as cited in Liu & Mathews, 2005). A central concept in Vygotsky’s work is the role that social collectivity plays in learning and development (Liu & Mathews, 2005). Vygotsky’s social constructivist theory argues that “knowing is relative to the situations in which the knowers find themselves” (Liu & Mathews, 2005, p. 392). The interconnection of the social and the individual is the cornerstone of social constructivist theory, and it provides a valid explanation for social and individual change.

Reflective Practitioner: Schon’s Idea

Schon (1986) describes reflection as what practitioners do to examine the increased understanding of a phenomenon that arises from practice. Reflectivity combines reflections on both past and present actions in order to improve future actions. Schon emphasizes that knowing must go together with doing, and thinking with action, because they work hand in hand; we cannot “know” and “think” without “doing” and “acting” (Schon, 1986). Thus, thinking with action is crucial to improving practice. In my experience, teachers and school administrators rarely use reflective action to enhance their praxis. Part of this research study will involve encouraging teachers to use narratives, or storytelling, as a form of reflection in action in order to improve and transform their teaching practice.

Experience: A Deweyan Idea

Dewey (1916) views experience as a continuum of reason; his work attempts to resolve the dichotomy between experience and reason. According to Dewey (1916), experience and reason are a continuous mesh of consciousness, most meaningful when connected to everyday life. Dewey describes two natures of experience. The first is “trying,” which relates to active experience; the second is “undergoing,” which relates to passive experience. Dewey was more concerned with active experience because it involves changing one’s actions through reflection. To better understand the nature of active experience, I identified two qualitative studies that described the experiences of teachers who became students and whose experiences as students helped them identify strategies to improve their teaching. Mann (2003), a college professor, described her own experience as a student attending an online course; from it, she identified several strategies that teachers can use to foster student learning in a virtual environment. Similarly, Sinclair (2004, as cited in Case, Marshall, & Linder, 2010) spent two years as a student in a mechanical engineering program. During her time as a student, she identified several challenges that students encounter when entering a new discourse or discipline, as well as strategies that educators can use to help their students succeed in a new discourse.

 

These two studies illustrate the need to understand teachers’ experiences with virtual labs, as such understanding may be one strategy for fostering student learning in a virtual environment. Currently, little research has been done in this area, especially among high school science students. My study will address this gap in the literature by exploring teachers’ experiences with virtual labs using a narrative inquiry approach. In addition, the teachers’ experiences and stories from my exploratory study will help other educators understand the challenges and opportunities associated with using virtual labs in their classrooms, including identifying best practices for integrating virtual labs into the science classroom.

Narrative Inquiry: Stories as a Reflection-in-Action Tool

Creswell (2013) identifies several approaches to conducting a narrative inquiry, including biographical studies, auto-ethnographies, life histories, and oral histories. In my exploratory study, I used a life story narrative approach. I am not, however, trying to portray each person’s entire life history; instead, my questions focus on a defined time period in the lives of two teachers, namely their experiences using virtual labs as a teaching tool in their high school chemistry courses. This life story narrative approach takes the form of a personal experience story. Denzin (1989a, as cited in Creswell, 2013) states that a personal experience story may be used to study an individual’s personal experience in a single episode or in multiple episodes. In this pilot study, I asked the teachers to recall the episodes in which they used virtual labs in their classrooms and to relay their personal experiences using these labs. In addition, I collected information about the teachers’ backgrounds. This information helped me contextualize how their experiences using virtual labs were influenced by their educational backgrounds and teaching philosophies, and how that information can inform best practices for other teachers with little or no experience using virtual labs.

Summary

In Chapter Two, I explained the key terms Social Constructivist Theory, Deweyan Experience, Schon’s Reflective Practitioner, and Narrative Inquiry. In the narrative inquiry tradition, stories are used as a tool for capturing practical experiences through reflection. These stories can then be passed from person to person in an organization as best practices. Brown, Denning, Groh, and Prusak (2005) posit that stories are a powerful tool for sharing practical experiences and knowledge in an organization such as a school or school system.

Chapter Three explains the research approach I have chosen for this study: a narrative approach. In addition, I offer justification for why I chose a narrative approach, and I explain how the data were collected and analyzed using the narrative inquiry approach.

 

 

 

CHAPTER THREE

METHODOLOGY

In Chapter Three, I explore the reasons I chose the narrative inquiry methodology for my research study. I also explain how I conducted the study, from data collection to data analysis, including how I selected my research site and participants.

What Led Me to Choose a Narrative Approach

Clandinin and Connelly (2000) argue that practical knowledge gathered from people’s experiences is sharable in story format, and narrative inquiry is arguably the best method for capturing those stories and the knowledge inherent in them. In addition, narrative inquiry is a useful methodology for describing an insider’s experiential knowledge in the form of storytelling (Clandinin & Connelly, 2000). My intention for this study was to identify the practical experiences of teachers (Clandinin & Connelly, 2000), to acknowledge my own self-reflective knowledge, and to capture the experiences of teachers who use virtual labs with their students.

To put my experience with technology in context, I will provide my life and professional history. I was born and raised in Tanzania, where I attended primary school, secondary school, high school, and university. I came to the United States in 2001. After I graduated with a master’s degree in environmental science from Towson University, I decided to teach for the Baltimore City Public Schools. I began my career as a teacher in July 2004 at an inner-city middle school, where most of my students were African-American and from low-income households.

When I was first hired as a science teacher, I had little exposure to classroom technology and its uses, and I found it very hard to implement a new technology in the classroom since I rarely received any training to accompany it. In 2009, I began teaching at a suburban high school in Atlanta. It was at this high school that I learned about virtual laboratories and began using them in my chemistry classroom. Again, I did not receive formal training on how to use these labs; instead, I learned by doing. I believe this is an experience that many new teachers face. Since many schools are shifting their investment from hands-on labs to virtual labs, I thought it would be useful to gather teachers’ personal and practical experiences with virtual labs. The challenge is that personal and practical knowledge is often hard to capture systematically.

In the process of finding the method most appropriate for answering my questions, I started with the phenomenological approach. According to Creswell (2013), a phenomenological study “describes the common meaning of several individuals of their lived experience of a concept or a phenomenon.” There are two types of phenomenological studies. The first is the heuristic phenomenological approach, which brings to the fore the personal experience of the researcher (Moustakas, 1990b, p. 9, as cited in Patton, 2002b). The second is the transcendental phenomenological approach, in which the researcher brackets themselves by acknowledging their own experiences with the phenomenon under investigation (Creswell, 2013). After a careful analysis of the method, however, I concluded that a phenomenological study was not the best fit for my research question because my sample size was too small and because I was relying on a single method of data collection, which is not advisable for a phenomenological study. To conduct a well-rounded phenomenological study, a number of data collection methods, such as surveys, observations, journaling, and photographs, need to be used.

I then turned to a mixed methods approach, which uses both qualitative and quantitative research designs and gained popularity in the 1990s (Creswell, 2011). Greene, Caracelli, and Graham (1989) define a mixed methods study as “research in which an investigator collects and analyzes data, integrates findings, and draws inferences using both qualitative and quantitative approaches or methods in a single study or program of inquiry” (p. 20). According to Creswell (2011), a mixed methods study increases the breadth and depth of research findings, and using more than one research method can help corroborate findings and strengthen their validity. To use a mixed methods design, Creswell (2011) suggests that the research questions must include both quantitative and qualitative elements; the formulated questions must address the needs of both a quantitative and a qualitative study design. Again, after a careful examination of the method and my questions, I realized that my questions did not match well with the mixed methods design.

While examining possible methods for capturing teachers’ experiences with virtual labs, narrative inquiry emerged slowly but surely as the best method for capturing those experiences and identifying the practical knowledge inherent in them. Narrative inquiry emerged as a new method in social research in the 1980s (Clandinin & Connelly, 1990). In 1986, Clandinin and Connelly experimented with narratives as an alternative way of representing experience in graduate courses at the Ontario Institute for Studies in Education (OISE). According to Clandinin and Connelly (2000), an individual’s story should be considered both phenomenon and method. Atkins (1995) pointed to two advantages of using narrative inquiry. First, the story-creating process is similar to the self-reflection process and thus helps to expand experience. Second, developing stories helps connect a person to the human experience through narratives. Therefore, narrative inquiry can be used to gather personal and practical experiences and knowledge and to share them with the community.

Data Collection

This study takes place in a high school in an upper-income suburban neighborhood in the Southeastern United States. I purposefully chose my two participants for the following reasons. First, I wanted them to have different levels of teaching experience. My first participant was a new teacher (two years of teaching experience) who had limited experience with virtual labs; I chose her because I wanted to understand and chronicle new teachers’ experiences with virtual lab usage in the science classroom. The other participant was a veteran teacher with more than 15 years of teaching experience; I wanted to interview him to gain deeper insight into the use of virtual labs by an experienced teacher. The second reason for choosing these two teachers was personal convenience: the participants and I work in the same hallway and have the same planning period, so I have easy access to them.

I used the life story narrative to elucidate the personal practical experiences of my teacher participants. I took the life story approach because I believe each and every one of us has a story to tell. I interviewed each participant for approximately 15 minutes in their classroom using a semi-structured interview guide. I began with an open-ended question, followed by a probing question whenever necessary to elicit a deeper conversation about the participant’s experiences. Even though I had developed structured, open-ended questions for the interviews, I conducted them mainly as conversations, because probing is most effective when it takes place in the form of a conversation (Clandinin & Connelly, 2000). Since I was using only a single method of data collection and a small sample of interviewees, I decided to record the interviews so I would not miss any relevant information and so I could produce verbatim transcripts for analysis. I used my iPad to record the interviews with the permission of each participant. I received human subjects approval in September 2013 and conducted both interviews in October 2013. The open-ended questions used for this study are presented below:

  1. Tell me about your educational and professional background.
    • Probe: How did you become an educator?
  2. What is your teaching philosophy?
  3. How do virtual labs fit within this philosophy?
  4. How did you learn about virtual labs?
  5. When did you start using them?
  6. Why did you decide to use virtual labs in your classroom?
  7. What do you see as barriers and benefits to using virtual labs with your students?
  8. What adaptations (if any) did you make to ensure that all students in your class benefit from virtual labs?

Data Analysis

            After I conducted and recorded the interviews, I analyzed the data by first listening to and transcribing the interviews. To better understand the stories, I used the restorying, or retelling, method to reconstruct the participants' stories as they were told to me during the interviews. I identified and interpreted the major themes, such as technology-related problems, when to use virtual labs, when not to use them, and in what instances they most benefit students' understanding. I then wrote summary statements for each of the identified themes using the information from the participants' interviews. The participants' narratives are presented in Chapter Four, and the conclusions and recommendations resulting from the research findings are presented in Chapter Five.
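The restorying and theme identification described above were done by hand. Purely as an illustration of the bookkeeping involved in the theme-identification step, and not as part of the actual analysis workflow, the short Python sketch below shows how coded interview excerpts could be grouped under their assigned themes; the excerpt texts and theme labels are paraphrased placeholders, not verbatim data.

    from collections import defaultdict

    # Hypothetical (participant, excerpt, theme) codes assigned while reading the transcripts.
    coded_excerpts = [
        ("Mr. Jones", "Students tend to copy from each other during virtual labs.", "barriers"),
        ("Mr. Jones", "I do a demo for the class beforehand.", "pre-lab discussions"),
        ("Ms. Tanisha", "I walk around and talk to students to check understanding.", "monitoring"),
    ]

    # Group excerpts by theme so a summary statement can be written for each theme.
    themes = defaultdict(list)
    for participant, excerpt, theme in coded_excerpts:
        themes[theme].append((participant, excerpt))

    for theme, excerpts in themes.items():
        print(f"Theme: {theme} ({len(excerpts)} excerpt(s))")
        for participant, excerpt in excerpts:
            print(f"  {participant}: {excerpt}")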


CHAPTER FOUR

TEACHER STORIES

In this chapter, I examine each participant's experience with virtual labs using the story-telling, or retelling, approach of narrative inquiry. I begin with a description of the classroom environment, followed by narratives from each participant's interview. In Chapter Five, I examine the data, offer analysis and interpretation, and conclude with recommendations for future studies. What is presented here is drawn from the verbatim transcripts of the participants' own words. To the best of my ability, I refrained from adding my own comments or additions; however, I sometimes use my own words to create smooth transitions where necessary. Note that the names used below are pseudonyms to protect the confidentiality of the participants.

Mr. Physics Jones’s Class

            Mr. Physics Jones is a 37-year-old white male who teaches physics and chemistry at the suburban high school. He has 13 years of teaching experience and has been with the science department for 10 years. Prior to working with this department, Mr. Jones worked at a private Christian high school for three years. Mr. Jones is a highly qualified teacher in the broad science category but specializes in teaching Advanced Placement Physics and general chemistry.

            My classroom is located next to Mr. Jones's classroom. We normally have lengthy conversations about teaching physics and chemistry, and we share a stock room for the chemicals and laboratory equipment we use to teach both subjects. In his spare time, Mr. Jones likes to run. He is the head coach of the school's running team, which has won numerous awards, including state and zone championships.

            Mr. Jones's classroom is very orderly. The classroom is arranged into two rows with eight lab desks. He has a Promethean board, an LCD projector, laptops, and a student response system that he uses on a daily basis. Mr. Jones also has five computers in the back of his classroom that are connected to the internet. Mr. Jones's students are very diverse, with a variety of racial and ethnic groups represented. His students also come from a range of socio-economic backgrounds, and his classes contain a fairly even gender distribution.

Mr. Jones’s Experience with Virtual Labs

Mr. Jones has a Bachelor's degree in environmental engineering and chemistry. He also has a Master of Arts in Teaching with a specialty in chemistry education. Mr. Jones worked as a research technician for the state of North Carolina before entering teaching. He decided to become a teacher because he thought his skills would be better utilized in educating the children of America. Mr. Jones's teaching philosophy centers on a triangle connecting teachers, parents, and students.

Mr. Jones feels that virtual labs give students a tool to explore a topic further on their own. He also feels that virtual labs are a good substitute when a student is absent; in that case, the virtual lab serves as the lab. Virtual labs also serve as an alternative when there are no funds for lab equipment or chemicals. Mr. Jones mentioned that he did not receive any professional development regarding the effective use of virtual labs; he learned how to use them through trial and error. Mr. Jones sees one barrier of virtual labs as being that students do not see the true results of what is happening: the labs are pre-programmed and therefore devoid of real-life experience. Another barrier is that students tend to copy from each other without engaging in the actual learning activity. In addition, virtual labs always produce the same results, so it is hard to talk about the real-life errors (e.g., experimental errors) that often occur during hands-on experiments.

Mr. Jones sees the benefits of virtual labs as including that they can be accessed anywhere at any time and that they require no preparation time for the teacher. In addition, virtual labs are useful substitutes for instruction, especially when the equipment is too expensive. Mr. Jones uses several adaptations to make sure that all students in his class benefit from virtual labs. First, he discusses the lab with students ahead of time. Second, he does a demonstration for the class beforehand. Third, he leads group discussions to enhance student understanding of the concepts covered in the virtual lab session.

Ms. Biology Tanisha's Class

            Ms. Biology Tanisha is a 34-year-old black female who teaches general biology in the science department. She has two years of teaching experience and has been with the science department for one year. Ms. Tanisha is a highly qualified teacher in the broad science category but specializes in teaching general biology. She explained her experience with virtual labs in her classroom during the interview.

            Ms. Tanisha's classroom is very orderly. The classroom is arranged into two rows with eight lab desks. She has a Promethean board, an LCD projector, a laptop cart, and student response systems that she uses often. Ms. Tanisha also has five computers in the back of her classroom that are connected to the internet. The students in Ms. Tanisha's classroom, like those in Mr. Jones's class, are very diverse in terms of race and socio-economic status. Her class also has an even distribution of boys and girls.

Ms. Tanisha’s Experiences with Virtual Labs

Ms. Tanisha has a bachelor's degree in education and became a certified teacher two years ago. She decided to become a teacher for four reasons: she is a people person; she likes showing children different ways to learn; she likes to give back to the community; and she thinks too many people enter teaching not because they want to teach but because they want to be part of something that offers vacation time. She wants to give back and show why having a good education is important. She says that growing up, even though her mom and her relatives were educators, she never saw the importance of going home and studying or doing what she was supposed to do. As a result, her GPA when she graduated from high school was below a 2.0, and she flunked out of college twice. The third time she went back, and later in graduate school, her GPA was well above a 3.0. She learned the importance of an education, but it took her a while, and now she is at a point in her life where she wants to give back and show why it is important to be educated.

Ms. Tanisha's teaching philosophy is that every child can learn, but not every child learns the same way. She believes that teachers need to engage all students individually, if possible, throughout the week and to communicate with their students continually so they know where each one is. She also believes that it is up to the teacher to engage each student so that every one of them can learn. In addition, Ms. Tanisha feels that virtual labs are valuable because students have the ability to rewind; with a hands-on lab done step by step in class, there are typically no resources for going back to see where a result came from. Virtual labs are good for checking one's work because of the ability to go back and trace where the information came from.

Ms. Tanisha went to the Explore Learning workshop last year, where she learned that a teacher has the ability to add to the labs everything he or she needs to engage students in their learning. She feels that the workshop was beneficial in making the labs better and helped her in different ways in getting the concepts across to her students. Ms. Tanisha sees the only barrier to virtual labs as being that some students are not as engaged as others. She feels that these students would prefer hands-on labs rather than virtual ones; some students simply do not want to do the lab because it does not fit their learning style. Ms. Tanisha recognizes several benefits of using virtual labs in the science classroom. First, virtual labs offer the ability to go back and redo the labs to a certain degree. Second, students have to follow directions, engage themselves in the lab, and learn at their own pace. Third, she thinks students learn more because virtual labs follow a learner-centered approach.

Ms. Tanisha uses various adaptations to make sure that all students in her class benefit from virtual labs. She walks around and talks to students often to make sure they understand the lab. She checks for understanding and engagement by communicating with each student on a regular basis.

Summary

In this chapter, I presented the participants' narratives. In Chapter Five, I present the complete data analysis, discussion, conclusions, and recommendations for future research.

CHAPTER FIVE

DATA ANALYSIS, CONCLUSION, AND RECOMMENDATIONS FOR FUTURE RESEARCH

In this chapter, I first analyze the data, then present the conclusions of the study, and finally offer my recommendations for future research.

Data Analysis and Conclusion

Narrative inquiry was used in this study to shed light on the practical experiences that teachers have when using virtual labs with their students. The purpose of the study was to identify the personal and practical experiences that teachers have when using virtual labs in their science classrooms. After reading and re-reading the interview transcripts, I identified two characteristics shared by both teachers: a love of teaching and a belief that all children can learn. In addition, I identified several best practices that these participants used to maximize the success of virtual labs in their classrooms: (1) pre-lab discussions, often with a demonstration; (2) post-lab discussions where students' questions are answered; (3) regular monitoring during lab sessions to check student understanding and engagement; and (4) proper training on how to use virtual labs effectively as a teaching tool.

Love of Teaching

            Both participants expressed their love of teaching during the interviews. I recognize that teaching is not a profession one enters to get rich or make money. Mr. Jones is an environmental engineer with many job options, but he chose to share his knowledge and engineering skills with the children of America. Ms. Tanisha spoke of her love of teaching very explicitly during the interview; she shared with me that she is a "people person" and loves to show students different ways to learn.

The Belief That All Children Can Learn

            The saying that all children can learn has been used in many educational articles and books, and in many settings it has become a cliché. During my interviews with both participants, however, I genuinely felt that these teachers believed what they were saying. Ms. Tanisha said, "All children can learn, but differently." It is true that all children can learn, and this is especially true when the needs of each student are met. Each child comes to class with his or her own capabilities, learning preferences, and world view. If these needs are not met by the teacher, some children will be left behind and deemed incapable of learning. Therefore, according to the participants, it is crucial to meet each child wherever he or she is and to help him or her achieve success in learning. This success, in turn, boosts the child's confidence to learn.

Pre-Lab Discussions

            Since virtual labs are somewhat different from hands-on, real-life experiments, it is paramount that teachers discuss the lab before students actually do it. Doing so improves student understanding by activating prior knowledge and making students ready to learn. Mr. Jones normally discusses the lab before students begin. This is a good practice because it helps iron out student misunderstandings and reduces the number of questions students have during the lab session. Once students know what to do and how to do it, completing the lab becomes easier for them and they are more likely to learn from it. Therefore, Mr. Jones and Ms. Tanisha employ pre-lab discussions to help their students understand what the lab is about and how to complete it.

Post-Lab Discussions and Regular Monitoring during Lab Sessions

Mr. Jones discussed post-lab discussions as an important tool for the effective use of virtual labs with students in science. Once students have completed their virtual lab sessions, it is important to have a discussion regarding the concept or concepts covered. This helps students consolidate what they have learned, and it helps the teacher assess what students have learned and which topics may need further discussion. I concur with Mr. Jones's views on this; post-lab discussions are crucial for helping students re-evaluate their understanding of the lab and receive confirmation of that understanding. Post-lab discussions also offer students the opportunity to explain and reflect on their understanding of the concept covered by the lab and to ask clarifying questions. In addition, both participants mentioned the importance of circulating throughout the classroom while students are completing the lab to monitor their understanding and make sure they remain engaged in the task. Circulating also gives students the opportunity to ask questions as they work so that they can successfully complete all their assigned tasks, and it allows teachers to monitor progress and provide one-on-one guidance as needed.

The Importance of Proper Training

            Lack of proper training was one issue raised by the participants during the interviews. In the absence of training, things tend not to work as effectively as they should, and this applies to virtual labs as well. Ms. Tanisha discussed a two-day training she received on how to use virtual labs effectively. In my view, in-service training is needed to help teachers understand how best to use virtual labs in their classrooms. School districts tend to buy these programs with little or no emphasis placed on training teachers to use them. Mr. Jones reported that he never received any formal training on how to use virtual labs; he trained himself through trial and error. Leaving teachers to train themselves on the effective use of virtual labs with their students is not a good practice. Teachers should be trained to use technology properly in order to increase student engagement and academic achievement.

Recommendations for Future Research

            This study centered on two research participants' practical and personal experiences with virtual labs. Six themes emerged from the interviews: (1) love of teaching, (2) the belief that all children can learn, (3) pre-lab discussions, (4) post-lab discussions, (5) regular monitoring during lab sessions, and (6) the importance of proper training. As discussed above, these themes have direct implications for the effective use of virtual labs in science classrooms. In order to validate the results of this study, additional research with more teachers from different settings is needed; for example, studies with teachers from middle school science or other high school science settings would be desirable. In addition, the questions used to capture teachers' practical experiences with virtual labs in this study were not very focused, so studies with more focused questions are needed to capture the essence of these practical experiences. Finally, I recognize that one's cultural background influences one's experiences. My background, cultural experiences, and world view may have affected the way I analyzed the data. Therefore, research conducted by people with different cultural backgrounds and experiences is warranted.

References

Boz, N., & Boz, Y. (2008). A qualitative case study of prospective chemistry teachers' knowledge about instructional strategies: Introducing particulate theory. Journal of Science Teacher Education, 19(2), 135-156.

Case, J. M., Marshall, D., & Linder, C. (2010). Being a student again: A narrative study of a teacher's experience. Teaching in Higher Education, 15(4), 423-433.

Cengiz, T. (2010). The effect of virtual laboratory on students' achievement and attitude in chemistry.

Clandinin, D. J., & Connelly, F. M. (2000). Narrative inquiry: Experience and story in qualitative research. San Francisco: Jossey-Bass.

Cole, M., & Wertsch, J. V. (2004). Beyond the individual-social antinomy in discussions of Piaget and Vygotsky. Retrieved October 2013, from www.massey.ac.nz/~alock/virtual/cplevyg.htm

Connelly, F. M., & Clandinin, D. J. (1990). Stories of experience and narrative inquiry. Educational Researcher, 19(5), 2-14.

Creswell, J. W. (2011). Designing and conducting mixed methods research (2nd ed.). Thousand Oaks, CA: SAGE.

Creswell, J. W. (2013). Qualitative inquiry and research design: Choosing among five approaches (3rd ed.). Thousand Oaks, CA: SAGE.

Dewey, J. (1916). Democracy and education. New York: Macmillan.

Falvo, D. (2008). Animations and simulations for teaching and learning molecular chemistry. International Journal of Technology in Teaching and Learning, 4(1), 68-77.

Gobert, J. D., et al. (2011). Examining the relationship between students' understanding of the nature of models and conceptual learning in biology, physics, and chemistry. International Journal of Science Education, 33(5), 653-684.

Greene, J., Caracelli, V., & Graham, W. (1989). Toward a conceptual framework for mixed-method evaluation designs. Educational Evaluation and Policy Analysis, 11(3), 255-274.

Hofstein, A. (2004). The laboratory in chemistry education: Thirty years of experience with developments, implementation, and research. Chemistry Education Research and Practice, 5(3), 247-264.

Hofstein, A., & Lunetta, V. N. (2004). The laboratory in science education: Foundations for the twenty-first century. Science Education, 88(1), 28-54.

Hofstein, A., & Mamlok-Naaman, R. (2007). The laboratory in science education: The state of the art. Chemistry Education Research and Practice, 8(2), 105-107.

Kennepohl, D. (2001). Using computer simulations to supplement teaching laboratories in chemistry for distance delivery. The Journal of Distance Education, 16(2), 58-65.

Kun-Yuan, Y., & Jian-Sheng, H. (2007). The impact of Internet virtual physics laboratory instruction on the achievement in physics, science process skills and computer attitudes of 10th-grade students. Journal of Science Education and Technology, 16, 451-461.

Mann, S. J. (2003). A personal inquiry into an experience of adult learning on-line.

Patton, M. Q. (2002b). Variety in qualitative inquiry: Qualitative research and evaluation methods (3rd ed.). Thousand Oaks, CA: SAGE.

Pratt, K., & Sims, R. (2012). Virtual and physical experimentation in inquiry-based science labs: Attitudes, performance, and access. Journal of Science Education and Technology, 21(1), 133-147.

Schon, D. A. (1987). Educating the reflective practitioner. New York: Jossey-Bass.

Smith, C., Maclin, D., Houghton, C., & Hennessy, G. (2000). Sixth-grade students' epistemologies of science: The impact of school science experiences on epistemological development. Cognition and Instruction, 18(3), 349-422.

Tatli, Z., & Ayas, A. (2013). The effect of a virtual chemistry laboratory on students' achievement. Educational Technology & Society, 16(1), 159-170.

Van LeJeune, J. (2002). A meta-analysis of outcomes from the use of computer-simulated experiments in science education. Unpublished Ed.D. dissertation, Texas A&M University.

Wittman, M., Steinberg, R., & Redish, E. (1999). Making sense of how students make sense of mechanical waves. The Physics Teacher, 37, 15-21.

Curriculum Evaluation Using Tyler’s Goal Attainment Model or Objectives-Centered Model


By: Shaaban Kitindi Fundi

In this paper I will describe the Tyler model while emphasizing its evaluative component. I will use the DeKalb County Science Curriculum in my analysis. Specifically, I will use Dunwoody High School students' outcome data from the End of Course Test (EOCT) in physical science and biology to evaluate the curriculum. Before I begin the evaluation, I will provide a brief overview of the Tyler model (what it is, what its parts are, and what criticisms have been leveled at it), and I will conclude by summarizing the steps I followed to complete the evaluation.

Tyler's goal attainment model, sometimes called the objectives-centered model, is the basis for many of the most common models in curriculum design, development, and evaluation. The Tyler model comprises four major parts: 1) defining the objectives of the learning experience; 2) identifying learning activities for meeting the defined objectives; 3) organizing the learning activities for attaining the defined objectives; and 4) evaluating and assessing the learning experiences. In this paper I will deal mostly with the evaluation component of the model. Before evaluating the DeKalb County science curriculum, however, I begin with a brief discussion of the Tyler model: what it is, what its parts are, and what it emphasizes.

The Tyler model begins by defining the objectives of the learning experience. These objectives must be relevant to the field of study and to the overall curriculum (Keating, 2006). Tyler's model derives curriculum objectives from three sources: 1) the student, 2) the society, and 3) the subject matter. When defining the objectives of a learning experience, Tyler emphasizes the input of students, the community, and the subject content; he believes that curriculum objectives that do not address the needs and interests of all three will not produce the best curriculum. The second part of Tyler's model involves identifying learning activities that will allow students to meet the defined objectives. To emphasize the importance of identifying learning activities that meet the defined objectives, Tyler states that "the important thing is for students to discover content that is useful and meaningful to them" (Meek, 1993, p. 83). In this way, Tyler is a strong supporter of the student-centered approach to learning. Overall, Tyler's model is designed to measure the degree to which pre-defined objectives and goals have been attained. The model focuses primarily on the product rather than the process of achieving the goals and objectives of the curriculum; it is product focused, evaluating the degree to which the pre-defined goals and objectives have been attained.

Several criticisms have been leveled at Tyler's goal attainment, or objectives-centered, model. The first is that it is difficult and time consuming to construct behavioral objectives, on which Tyler's model mainly relies. The objectives in Tyler's model come from three sources (the student, the society, and the subject matter), and all three sources have to agree on which objectives need to be addressed. This is a cumbersome process, and it is difficult to reach consensus among the various stakeholder groups. The second criticism is that the model is too restrictive and covers a small range of student skills and knowledge. The third criticism is that the model is too dependent on behavioral objectives, and it is difficult to state plainly, in behavioral terms, objectives that cover non-specific skills such as critical thinking, problem solving, and value-acquiring processes (Prideaux, 2003). The fourth and last criticism is that the objectives in Tyler's model are too student-centered, so teachers are not given any opportunity to shape the learning experiences as they see fit to evoke the desired learning outcomes.

To evaluate the DeKalb County School System science curriculum, I downloaded the DeKalb County physical science and biology curriculum-at-a-glance documents from the district's website. After a careful look at these documents, I realized that neither the biology nor the physical science curriculum fits most definitions of a true curriculum. They are plainly instructional guides listing standards, units to be covered, and the time allocated to each unit. In my understanding, a curriculum should encompass more than a list of standards, units, and time allocations. According to Robert Gagne (1967), a curriculum encompasses four categories: 1) subject matter or content, 2) statements of end objectives or goals, 3) the sequencing of content, and 4) pre-assessment of skills. The DeKalb County science curricula for physical science and biology lack many of these features.

The DeKalb County curriculum-at-a-glance documents do not appear to meet Kerr's definition of curriculum either. According to Kerr, a curriculum is "all the learning which is planned and guided by school, whether it is carried on in groups or individually, inside or outside the school" (Kerr, 1968, as cited in Kelly, 2009, p. 12). Kerr's definition of curriculum, together with Gagne's categories, appears to encompass more than just the standards, the units covered, and the time allocated to each unit. In other words, a curriculum is much broader than a course syllabus or a curriculum guide, and it includes both the planned and the unplanned consequences of the curriculum.

In order to evaluate the biology and physical science curricula at Dunwoody High School, I created a table containing the spring EOCT scores for the two courses. The data span three years, from spring 2011 to spring 2013. I also compare Dunwoody's EOCT scores with those of DeKalb County as a whole and with the summary scores for the state of Georgia.

Spring 2011 to 2013 EOCT Score Summary (% Pass)

                          Biology                  Physical Science
Location                  2011    2012    2013     2011    2012    2013
Dunwoody High School        74      85      84       78      73      72
DeKalb County               59      62      63       66      66      83
State of Georgia            70      73      75       76      78      67

Based on the data for the past three years, Dunwoody High School has been meeting the science curricular objectives in physical science and biology. The year-by-year pass rates for biology are much higher at Dunwoody High School than for the rest of the county. For example, in 2011, Dunwoody High School students' pass rate on the biology EOCT was 12% higher than the DeKalb County average, and in the same year Dunwoody High School outperformed the state average pass rate by 4%. Thus, the biology pass rates at Dunwoody High School appear to be much higher than the state's EOCT pass averages for the same period.

Similar trends are observed in the physical science EOCT pass rates. Dunwoody High School met the physical science curriculum objectives in the years 2011 through 2013, and its pass rates are higher than the DeKalb County averages. For example, in 2012, Dunwoody High School had 10% more of its students pass the physical science EOCT than the county as a whole. However, Dunwoody High School's physical science pass rate was 5% lower than the state of Georgia average. Overall, Dunwoody High School appears to meet the science curricular objectives in both biology and physical science. I would recommend that Dunwoody High School set yearly improvement goals to raise its physical science pass rate from the mid-70s to the high 80s over the next three years, and to raise its biology pass rate from the mid-80s to the mid-90s over the same period.
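The comparisons above can be recomputed directly from the score table. The short Python sketch below is illustrative only (it was not part of the original evaluation); it simply subtracts the county and state pass rates from Dunwoody High School's pass rates for each subject and year, using the values from the table.

    # EOCT percent pass rates, taken from the Spring 2011-2013 summary table above.
    # Each list holds the rates for 2011, 2012, and 2013, in that order.
    pass_rates = {
        "Dunwoody High School": {"Biology": [74, 85, 84], "Physical Science": [78, 73, 72]},
        "DeKalb County":        {"Biology": [59, 62, 63], "Physical Science": [66, 66, 83]},
        "State of Georgia":     {"Biology": [70, 73, 75], "Physical Science": [76, 78, 67]},
    }
    years = [2011, 2012, 2013]

    # Percentage-point gap between Dunwoody and each comparison group, per subject and year.
    for subject in ("Biology", "Physical Science"):
        for comparison in ("DeKalb County", "State of Georgia"):
            for i, year in enumerate(years):
                gap = (pass_rates["Dunwoody High School"][subject][i]
                       - pass_rates[comparison][subject][i])
                print(f"{subject} {year}: Dunwoody minus {comparison} = {gap:+d} points")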

In summary, I decided to evaluate the DeKalb County science curricula because of my interest in understanding the county's science curriculum. To accomplish the evaluation, I chose two courses (biology and physical science) that have End of Course Tests and used the achievement test (EOCT) data, which are readily available on the county's website. In this evaluation, I took the traditional perspective, which allowed me to look at the objectives and the data and to compare the students' percent pass scores for my school, the county, and the state. This allowed me to determine whether the objectives were met and, if so, to what degree.

I visited Dunwoody High School on numerous occasions during this exercise. I believe technology plays a large role in how students learn and how they perform on achievement tests. Dunwoody High School has three fully functional computer labs. In addition, the science department has one hundred STEM lab laptops, fifty iPads, and eighty hand-held student response systems, which provide opportunities for students to practice what they have learned in class using Gizmos and virtual labs. We also use the flipped classroom model. I believe the combination of in-class instruction and virtual classrooms helps our students meet the science curricular objectives for the county and the state.

References

Gagne, R. M. (1967). Curriculum research and the promotion of learning. In R. W. Tyler, R. M. Gagne, & M. Scriven (Eds.), Perspectives of curriculum evaluation. Chicago: Rand McNally.

Keating, S. (2006). Curriculum development and evaluation in nursing. Philadelphia, PA: Lippincott Williams & Wilkins.

Kerr, J. F. (1968). The problem of curriculum reform. In J. F. Kerr (Ed.), Changing the curriculum. London: University of London Press.

Meek, A. (1993). On setting the highest standards: A conversation with Ralph Tyler. Educational Leadership, 50, 83-86.

Prideaux, D. (2003). Curriculum design: ABC of learning and teaching in medicine. British Medical Journal, 326(7383), 268-270.