Title of the Article: ChemVLab+: Evaluating a Lab Tutor for High School Chemistry
In the article entitled “ChemVLab+: Evaluating a Virtual Lab Tutor for High School Chemistry,” Davenport and colleagues argue that teaching high school chemistry typically involves quantitative problem-solving activities, with the assumption that students will learn core concepts through the manipulation of numbers and symbols. A related assumption is that students who can successfully perform complex calculations have mastered these core concepts, and that this mastery reflects conceptual understanding. Research in chemistry education, however, questions these assumptions. For example, it is unclear whether quantitative ability is an indicator of conceptual understanding, and even high-achieving students may lack basic knowledge of core principles.
In their article, Davenport et al. provide several examples to challenge the assumption that quantitative ability reflects conceptual understanding. In their first example, they cite a study by Smith and Metz (1996), which found that students performed well on traditional quantitative acid/base assessments but failed to distinguish strong from weak acids when shown examples in diagram or graphic form. They argue that this example indicates “that definition terms were used without true comprehension of the concept.”
In addition, the authors argue that the current emphasis on algorithmic problem solving does not adequately prepare students with the conceptual understanding they need to reason in chemistry. To support this view, they cite a study by Nakhleh and Mitchell (1993), which found that “when students are given both algorithmic and conceptual items paired for identical concepts, more students were successful on solving algorithmic items rather than conceptual items.” In that study, half of the students with high algorithmic performance had low conceptual performance, indicating difficulty connecting mathematical representations with the underlying chemistry concepts. From this study, the authors conclude that the “current emphasis on algorithmic problem solving does not prepare students well with the conceptual understanding needed to reason properly in the world of chemistry.”
Given the mounting evidence (e.g., Bodner & Herron, 2002; Gabel & Bunce, 1994; Nakhleh & Mitchell, 1993; Smith & Metz, 1996) discrediting the assumption that quantitative ability reflects conceptual understanding, the authors designed a study to test an intervention aimed at improving chemistry students’ conceptual knowledge in addition to their quantitative skills. The intervention, the ChemCollective Virtual Lab, engages students in meaningful problem solving around complex chemistry concepts to improve their conceptual understanding of core ideas. The authors employ a mixed-methods approach involving classroom observations (to capture student engagement), pretests and posttests (to measure achievement in both quantitative and conceptual skills), log-file analyses (to trace learning as it occurs across repeated student interactions with the system), and teacher interviews (to solicit teachers’ input on what worked and what needed improvement) to evaluate the effectiveness of the ChemCollective Virtual Lab.
Strength of the Article
The authors provide a strong justification for their assertion that quantitative ability does not necessarily indicate conceptual understanding of core concepts in chemistry by citing several examples from the literature. Conceptual learning, the authors argue, can only be achieved through authentic manipulation of real-world examples, informed negotiation, short-term feedback, and live tutoring. They test this hypothesis by evaluating a chemistry teaching tool they developed, the ChemCollective Virtual Lab, which includes exercises to improve both quantitative skills and conceptual learning, the two skills necessary to master chemistry. Through the ChemCollective Virtual Lab, students have the opportunity to apply chemistry knowledge to real-world examples and receive immediate, individualized feedback while the system estimates their proficiency in core concepts. The results of the mixed-methods evaluation suggest that students were actively engaged with the tool and improved their understanding of chemistry. Teachers also found the activities to be worthwhile.
Overall, the authors make a strong case against the assumption that quantitative ability reflects conceptual mastery in chemistry. Their argument is further strengthened by evidence of the ChemCollective Virtual Lab’s effectiveness at improving students’ understanding of chemistry by targeting both quantitative skills and conceptual learning. This article is a good example of how to develop an intervention based on an identified gap in the literature, to test that intervention with a rigorous evaluation, and to report the results in a way that is useful to other educators and researchers.
Weakness of the Article
While I mostly agree with the authors’ argument that quantitative skills do not necessarily reflect conceptual understanding, I question their assertion that virtual tutoring alone can sustain student motivation and engagement over long periods of time. A teacher’s role in motivating students, monitoring their progress, and explaining the activity and what students should take from it is also important, and it is insufficiently addressed in this article.
I also question the assertion that computer tutoring alone can improve students’ conceptual understanding of chemistry. Students’ misunderstandings of key chemistry concepts often arise from deeply held beliefs developed over a long period of time. A single lesson from a computer with simple explanations may not sufficiently address and correct these misconceptions. Teachers, through ongoing observation and engagement with students, can identify and correct them. Therefore, while I value the ChemCollective Virtual Lab as a teaching tool, I do not believe it is a substitute for quality teaching. Without teacher input and engagement, I do not believe the ChemCollective Virtual Lab and tools like it will be successful in the long run. Thus, while the ChemCollective Virtual Lab may be an important tool in my arsenal for teaching chemistry, it cannot be the only tool.
Since I am interested in evaluating the effectiveness of virtual labs in improving students’ understanding of chemistry concepts for my own thesis, I found this article to be very useful for several reasons:
1. It has provided me with insights into how I should approach my literature review and the writing of my conceptual framework. I have discovered that searching the reference lists of relevant articles can help me find articles directly related to my thesis.
2. I also liked how they used a mixed-methods approach, including a) classroom observations of student engagement, b) pretests and posttests, and c) teacher interviews, to evaluate their intervention. In my own study, I intend to examine engagement, student achievement with virtual labs versus paper-and-pencil instruction, and student perceptions in order to compare the two teaching methods.
3. This article also helped me think about my data analysis plan. The authors used a paired-samples t-test to compare students’ pre- and posttest scores, and I may use a similar approach in my thesis. It has therefore enhanced my understanding of the data analysis methods I might employ in my own study.
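To make the analysis concrete for my own planning, the paired-samples t-test above can be sketched in a few lines of Python. The scores below are hypothetical, invented for illustration only; they are not data from the ChemVLab+ study.

```python
import math

# Hypothetical pretest/posttest scores for nine students (illustrative only;
# not data from the ChemVLab+ study).
pretest  = [52, 61, 48, 70, 55, 63, 58, 66, 49]
posttest = [60, 68, 55, 74, 62, 70, 61, 72, 58]

# Paired design: each posttest score is matched with the same student's
# pretest score, so the test is run on the per-student differences.
diffs = [post - pre for pre, post in zip(pretest, posttest)]
n = len(diffs)
mean_d = sum(diffs) / n

# Sample standard deviation of the differences (n - 1 in the denominator).
sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))

# Paired-samples t statistic: mean difference over its standard error,
# with n - 1 degrees of freedom.
t_stat = mean_d / (sd_d / math.sqrt(n))
print(f"mean gain = {mean_d:.2f}, t({n - 1}) = {t_stat:.2f}")
```

In practice a statistics package (e.g., SciPy’s `scipy.stats.ttest_rel`) would also report the p-value; the hand computation above just shows what the test is doing with paired scores.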
Davenport, J. L., Rafferty, A., Timms, M. J., Yaron, D., & Karabinos, M. (2012). ChemVLab+: Evaluating a virtual lab tutor for high school chemistry. In Proceedings of the 2012 International Conference of the Learning Sciences.