Usable Knowledge is a trusted source of insight into what works in education — translating new research into easy-to-use stories and strategies for teachers, parents, K-12 leaders, higher ed professionals, and policymakers. Usable Knowledge is produced at the Harvard Graduate School of Education by Bari Walsh (senior editor) and Leah Shafer (staff writer). Contact us at email@example.com.
Developing Flexibility in Math Problem Solving
To solve math problems accurately and efficiently, students need to learn multiple strategies as well as how to choose among them
A central claim of the current reform movement in mathematics education is that students benefit from comparing and contrasting multiple solution methods. Cognitive science research supports the value of using comparison and contrast to promote general learning: identifying similarities and differences in multiple examples has proven to be a critical and fundamental pathway to flexible, transferable knowledge.
However, few experimental studies have been conducted to demonstrate the value of this approach in math classrooms. Research in mathematics education shows the benefits of a variety of practices advocated by reformers, but we don't know which of these practices are the most effective for student learning.
In his current research, Harvard Graduate School of Education Associate Professor Jon Star, who is also a former middle and high school math teacher, is examining the value of comparing, reflecting on, and discussing multiple solution methods. In a series of experimental studies in middle school classrooms, Star has found that comparing and contrasting solution methods — as opposed to studying one method at a time — does in fact promote greater learning.
Star and his team of researchers traveled to a private, urban school in Tennessee, where they spent four days in seventh-grade mathematics classrooms, specifically to evaluate the effectiveness of comparing multiple solution methods on learning to solve linear equations. A total of 70 students (36 girls, 34 boys) participated in the study, in two regular and two advanced classes.
Star hypothesized that this approach would promote three critical components of mathematical competence: procedural knowledge; procedural flexibility, the ability to generate, recognize, and evaluate multiple solution methods for the same problem; and conceptual knowledge, students' verbal and nonverbal understanding of algebra concepts, such as maintaining equivalence and the meaning of variables.
“We decided to pull out this one practice in math and subject it to more rigorous testing,” says Star. “Is there a benefit to contrasting and comparing multiple examples? We wanted to demonstrate that this is a better method.”
On the first day of the study, students were given 30 minutes to complete a pretest. Then, for the next two days, Star's experimental curriculum replaced the normal materials on solving linear equations. A member of the team or the teacher began the class each day by conducting a ten-minute lesson, and then students worked with a partner on a packet of algebra problems for the remainder of the period.
Students were randomly assigned to a partner, and then partners were randomly assigned to one of two groups. The “compare” group was given a packet of 12 equations, each solved in two different ways, with the solutions side by side on the same page. These worked examples typically illustrated a conventional method for solving an equation, and then a shortcut method that reduced the number of computations and steps needed to solve the equation.
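As an illustration (this particular equation is not taken from the study's materials), a worked-example pair of this kind might contrast distributing first with a shortcut that divides both sides first, cutting out a step:

```latex
% Conventional method: distribute first
\begin{aligned}
3(x + 2) &= 12 \\
3x + 6   &= 12 \\
3x       &= 6  \\
x        &= 2
\end{aligned}
\qquad
% Shortcut: divide both sides by 3 first
\begin{aligned}
3(x + 2) &= 12 \\
x + 2    &= 4  \\
x        &= 2
\end{aligned}
```

Both methods reach the same answer; the shortcut simply requires fewer computations, which is the kind of difference students in the compare condition were prompted to notice.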
The other group studied sequentially presented solutions. In their packets, the same solution methods were presented as in the compare condition, but each worked example was shown on a separate sheet — one per page.
When studying the worked examples, students had to describe each solution to their partner and answer accompanying questions, first verbally and then in writing. Tape recorders were placed on the desks to record the students' discussions. “We were interested in the process by which they were making sense of it all,” says Star, “and comparing what they were saying to how these methods work.”
To assess the effectiveness of the compare presentation as opposed to the sequential, Star gave the students a posttest on the fourth day. The teacher first provided a brief summary lesson, and then students were given 30 minutes to complete the posttest, which was identical to the pretest.
The team found that comparing and contrasting alternative solution methods led to greater gains in procedural knowledge and flexibility, and comparable gains in conceptual knowledge, compared with studying the same methods sequentially.
- Procedural knowledge: Comparison helped students become better at solving linear equations. This effect was found both on problems similar to those in the study packets and on transfer problems, which differed from those the students had already seen.
- Flexibility: Students in the compare group were better able to generate multiple solution methods for the same problem. They were also more likely to use the demonstrated shortcuts during the posttest.
- Conceptual knowledge: The compare and sequential groups did not differ in their knowledge of the big ideas of equation solving — such as the concept of equivalence — but students in both groups showed improvements from pretest to posttest.
Further evidence of the benefits of comparison emerged from the process data Star collected. In their written explanations, students in the compare group almost always referenced multiple methods, focused on the solution method, and judged the efficiency or accuracy of the methods. In contrast, students in the sequential group were much less likely to do so.
“We found that the two examples per page format, combined with questions that asked the student to look at similarities and differences between the two, had a big impact on students' ability to solve math problems correctly, and on their ability to use multiple strategies,” says Star. “Both groups saw the same strategies, but the student who saw them side by side remembered how to use them, and then used them more later on.”
Star and his team have replicated and extended these findings in several additional studies, conducted in public and private schools in Michigan, Tennessee, and Massachusetts. However, this research has also raised a number of interesting questions that Star continues to explore. For example, in a math classroom there are many different things that can be compared: teachers could compare the same problem, solved in two different ways (as was done in the Tennessee study); two different problems, each solved the same way; or two very similar problems, each solved the same way (as is typically seen in math texts).
Is one of these types of comparison optimal for student learning? Star's investigation into the role of comparison in students' learning of math continues — with the goal of discovering how teachers can learn to harness the power of comparison most effectively in their math classrooms.
For more information on this research:
Rittle-Johnson, B., & Star, J. R. (2007). Does comparing solution methods facilitate conceptual and procedural knowledge? An experimental study on learning to solve equations. Journal of Educational Psychology, 99(3), 561-574.