Harvard Graduate School of Education

Heather C. Hill

Jerome T. Murphy Professor in Education

Degree:  Ph.D., University of Michigan (2000)
Phone:  617.495.1898
Vitae/CV:   Heather C. Hill.pdf
Office:  Gutman 445
Faculty Assistant:  Cindy Floyd


Heather C. Hill is a professor in education at the Harvard Graduate School of Education (HGSE). Her primary work focuses on teacher and teaching quality and the effects of policies aimed at improving both. She is also known for developing instruments for measuring teachers' mathematical knowledge for teaching (MKT) and the mathematical quality of instruction (MQI) within classrooms. Hill is co-director of the National Center for Teacher Effectiveness and principal investigator of a five-year study examining the effects of Marilyn Burns Math Solutions professional development on teaching and learning. Her other interests include knowledge use within the public sector and the role that language plays in the implementation of public policy. She has served as a section chair for the American Educational Research Association and Society for Research on Educational Effectiveness conferences, and on the editorial boards of the Journal for Research in Mathematics Education and the American Educational Research Journal. She is the co-author, with David K. Cohen, of Learning policy: When state education reform works (Yale University Press, 2001).


Sponsored Projects

Strengthening the Research Base that Informs STEM Workforce and Curriculum Improvement Efforts (2014-2016)
National Science Foundation

Survey of U.S. Middle School Mathematics Teachers and Teaching (2014-2018)
National Science Foundation

Center for the Study of Interactive Knowledge Utilization (2014-2019)

Developing Common Core Classrooms Through Rubric-Based Coaching (2014-2017)
National Science Foundation

Exploring Methods for Improving Teachers' Mathematical Quality of Instruction (2012-2015)
National Science Foundation

In this exploratory study, we will develop and test professional development aligned with the Mathematical Quality of Instruction observational instrument. In the program, teachers will learn to score videotapes of instruction using the MQI framework, first by completing online training and then by participating in a series of group meetings with others at their school. These meetings will combine aspects of video clubs (Borko, Jacobs, Eiteljorg & Pittman, 2008; van Es & Sherin, 2006) and lesson analysis (Santagata & Angelici, 2010), using the MQI framework to view and score portions of lessons. A comparison group will enable us to understand the effects of the MQI on several measures, including teachers' talk, their analysis of videotapes, and their reflections on their own lessons. Owing to cost constraints, we will leave the collection of data on instructional or student outcomes to a future study. However, we believe that the outcomes of interest noted above will provide adequate proxies for practice, in that they have been linked to practice or are precursors to the improvement of practice.

In addition, we will systematically vary the conditions under which the MQI-based professional development is delivered in order to gain knowledge about the best design for a scaled-up implementation. To do so, we will expose fifteen groups of fourth and fifth grade teachers to MQI-based lesson analysis. These fifteen groups will be randomly assigned into five conditions (three groups per condition), with each group engaging in weekly or bi-weekly lesson analysis using the MQI. However, the amount of facilitator intervention (less vs. more), the origin of the videotape (teachers’ own video vs. video from a library), and the delivery method (face-to-face vs. internet) will vary by condition.
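The assignment scheme described above (fifteen groups, five conditions, three groups per condition) can be sketched in a few lines. This is an illustrative sketch only: the group labels and the particular facilitation/video/delivery combinations below are hypothetical, not the project's actual condition definitions.

```python
import random

# Hypothetical labels for the fifteen teacher groups.
groups = [f"group_{i:02d}" for i in range(1, 16)]

# Five illustrative conditions varying facilitation, video source, and delivery.
# The actual combinations used in the study are not specified here.
conditions = [
    "more-facilitation / own-video / face-to-face",
    "less-facilitation / own-video / face-to-face",
    "more-facilitation / library-video / face-to-face",
    "more-facilitation / own-video / internet",
    "less-facilitation / library-video / internet",
]

def assign_groups(groups, conditions, per_condition, seed=None):
    """Randomly assign groups to conditions, `per_condition` groups each."""
    rng = random.Random(seed)
    shuffled = groups[:]          # copy so the input list is untouched
    rng.shuffle(shuffled)
    assignment = {}
    for i, condition in enumerate(conditions):
        for g in shuffled[i * per_condition:(i + 1) * per_condition]:
            assignment[g] = condition
    return assignment

assignment = assign_groups(groups, conditions, per_condition=3, seed=42)
```

Shuffling once and slicing into equal blocks guarantees the balanced design (exactly three groups per condition) that simple independent coin flips would not.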

Impact Evaluation of Math Professional Development (2012-2016)
U.S. Department of Education

Investigating the Effects of Professional Development, Mathematical Knowledge for Teaching, and Instruction on Student Outcomes (2009-2014)
National Science Foundation

In this research, the research team sheds light on twin problems facing efforts to improve teacher quality in mathematics. The first problem is theoretical: What mathematical knowledge do teachers need to effectively instruct children? Does teachers' basic, advanced, or profession-specific knowledge (e.g., mathematical knowledge for teaching, or MKT) matter most to student outcomes? The second problem is more practical: Can a particular professional development program, Math Solutions, improve teachers' mathematical knowledge for teaching, their instruction, and student outcomes? To address these twin problems, the researchers conduct a cluster randomized trial to examine the efficacy of Math Solutions on elementary teachers' MKT, their instruction, and their students' learning. We chose to embed the study of teacher MKT in an evaluation of professional development because it is not possible to randomly assign teachers to different levels of MKT. Instead, the teachers are randomly assigned to Math Solutions with the goal of examining the contribution of changes in teachers' MKT, as it develops in and through the professional development, to instruction and student achievement. Data collection takes place in the Albuquerque Public Schools over three years and involves 80 fourth and fifth grade teachers in approximately 12 schools. Half the eligible teachers are randomly assigned to the mathematics professional development and half receive coaching in another subject. Longitudinal data are collected on teachers' mathematical knowledge, their instruction, and their students' gains, allowing us to examine the effect of Math Solutions on these outcomes.

National Center for Teacher Effectiveness: Validating Measures of Effective Math Teaching (2009-2014)
U.S. Department of Education, Institute of Education Sciences

Topic: Education Research and Development Centers: Teacher Effectiveness

Focus: Mathematics Instruction, Grades 4 through 6

Overview: The Harvard Graduate School of Education (HGSE) is well placed to combine disparate strands of research and accelerate the search for a valid, scalable measure of teacher effectiveness. The proposed center has four primary goals:

1. Unify three disparate strands of research: "value-added" modeling, the direct study of math instruction, and non-instructional predictors of teacher effectiveness (e.g., assessments of teacher knowledge, student evaluations). The team of investigators includes leaders from all three areas.

2. Develop an empirically derived composite measure of teacher effectiveness that efficiently combines estimated impacts on student achievement, ratings of pedagogical practice on both general and content-specific observation protocols, tests of teachers' mathematical knowledge, and student evaluations.

3. Operationalize the measures to ensure that they are usable in the field, with training videos for principals, test items for teacher assessments, student evaluation forms, etc.

4. Externally validate the estimated differences in teacher effectiveness by randomly assigning classroom rosters within teams of teachers working in the same grades and schools, comparing their students' achievement gains.

Partner Districts: District of Columbia and Charlotte-Mecklenburg Public Schools

Identification Study: Over two years, we will administer a student assessment in the fall and spring; collect four video observations per study teacher each year; rate those videos using both general and content-specific rubrics; assess teachers with the test of Mathematical Knowledge for Teaching; and administer student evaluations. Using a novel empirical approach, we will form an empirical Bayes composite measure, efficiently combining all of the above in a manner that best predicts student achievement gains in a non-experimental setting.

Validation Study: In a third year of data collection, we will randomly assign classrooms of students to teachers working in the same grades and schools. We will repeat the video collection and ask principals, using the training materials provided by the research team, to rate their teachers' instruction. Using those videos, we will confirm that the principals' ratings of instruction are consistent with the researcher ratings. We will also test whether the predicted differences in student achievement under the composite measure are borne out after random assignment. If not, we will use the student outcomes following random assignment to recalibrate the composite measure.
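The "empirical Bayes" idea invoked above can be illustrated with a toy shrinkage calculation: noisy per-teacher estimates are pulled toward the grand mean in proportion to their estimated reliability (signal variance over total variance). This is a generic sketch of the statistical idea, not the Center's actual estimator, and every number in it is made up.

```python
from statistics import mean, pvariance

def eb_shrink(estimates, error_var):
    """Shrink noisy per-teacher estimates toward the grand mean.

    reliability = signal variance / (signal + noise variance);
    the noisier the estimates, the harder they are pulled to the mean.
    """
    grand_mean = mean(estimates)
    total_var = pvariance(estimates)              # signal + noise combined
    signal_var = max(total_var - error_var, 0.0)  # method-of-moments estimate
    denom = signal_var + error_var
    reliability = signal_var / denom if denom else 0.0
    return [grand_mean + reliability * (x - grand_mean) for x in estimates]

# Toy value-added estimates (in student SD units) with assumed sampling noise.
raw = [0.30, -0.10, 0.05, 0.22, -0.25]
shrunk = eb_shrink(raw, error_var=0.02)
```

Each shrunken estimate sits between the raw estimate and the grand mean; a full composite measure would additionally weight several such measures (observation ratings, MKT scores, student evaluations) by how well they jointly predict achievement gains.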


Publications

Hill, H.C. & Grossman, P. (2013). Learning from teacher evaluations: Challenges and opportunities. Harvard Educational Review, 83, 371-384.

Hill, H.C., Beisiegel, M., & Jacob, R. (2013). Professional development research: Consensus, crossroads, and challenges. Educational Researcher, 42(9), 476-487.

Hill, H.C. (in press). The nature and effects of middle school mathematics teacher learning experiences. Teachers College Record.

Hill, H.C., Umland, K.L., & Kapitula, L.R. (2011). A validity argument approach to evaluating value-added scores. American Educational Research Journal.

Hill, H.C. (2010). The nature and predictors of elementary teachers' mathematical knowledge for teaching. Journal for Research in Mathematics Education, 41(5), 513-545.

Hill, H.C. & Shih, J. (2009). Research commentary: Examining the quality of statistical mathematics education research. Journal for Research in Mathematics Education.

Hill, H.C. & Ball, D.L. (2009). The curious — and crucial — case of mathematical knowledge for teaching. Phi Delta Kappan, 91, 68-71.

Hill, H.C. (2009). Evaluating value-added models: A measurement perspective. Journal of Policy Analysis and Management, 28, 702-709.

Delaney, S.F., Ball, D.L., Hill, H.C., Schilling, S.G., & Zopf, D.A. (2008). Adapting U.S. measures of "Mathematical Knowledge for Teaching" for use in Ireland. Journal of Mathematics Teacher Education, 11, 171-197.

Hill, H.C., Blunk, M., Charalambous, C., Lewis, J., Phelps, G.C., Sleep, L., & Ball, D.L. (2008). Mathematical knowledge for teaching and the mathematical quality of instruction: An exploratory study. Cognition and Instruction, 26, 430-511.

Hill, H.C., Ball, D.L., & Schilling, S.G. (2008). Unpacking "pedagogical content knowledge": Conceptualizing and measuring teachers' topic-specific knowledge of students. Journal for Research in Mathematics Education, 39, 372-400.

Schilling, S.G. & Hill, H.C. (2007). Assessing measures of mathematical knowledge for teaching: A validity argument approach. Measurement: Interdisciplinary Research and Perspectives, 5(2-3), 70-80.

Hill, H.C., Ball, D.L., Blunk, M., Goffney, I.M., & Rowan, B. (2007). Validating the ecological assumption: The relationship of measure scores to classroom teaching and student learning. Measurement: Interdisciplinary Research and Perspectives, 5(2-3), 107-117.

Hill, H.C., Dean, C., & Goffney, I.M. (2007). Assessing elemental and structural validity: Data from teachers, non-teachers, and mathematicians. Measurement: Interdisciplinary Research and Perspectives, 5(2-3), 81-92.

Schilling, S.G., Blunk, M., & Hill, H.C. (2007). Test validation and the MKT measures: Generalizations and conclusions. Measurement: Interdisciplinary Research and Perspectives, 5(2-3), 118-127.

Hill, H.C., Ball, D.L., Sleep, L., & Lewis, J.M. (2007). Assessing teachers' mathematical knowledge: What knowledge matters and what evidence counts? In F. Lester (Ed.), Second handbook of research on mathematics teaching and learning (pp. 111-155). Charlotte, NC: Information Age Publishing.

Hill, H.C. (2007). Mathematical knowledge of middle school teachers: Implications for the No Child Left Behind policy initiative. Educational Evaluation and Policy Analysis, 29, 95-114.

Hill, H.C. (2007). Teachers' ongoing learning: Evidence from research and practice. The Future of Children, 17, 111-128.

Hill, H.C. & Lubienski, S.T. (2007). Teachers' mathematics knowledge for teaching and school context: A study of California teachers. Educational Policy, 21(5), 747-768.

Learning Mathematics for Teaching. (2006). A coding rubric for measuring the quality of mathematics in instruction. Ann Arbor, MI: Authors.

Hill, H.C. (2006). Language matters: How characteristics of language complicate policy implementation. In M.I. Honig (Ed.), New directions in education policy implementation: Confronting complexity. Albany, NY: SUNY Press.

Hill, H.C. (2005). Content across communities: Validating measures of elementary mathematics instruction. Educational Policy, 19, 447-475.

Hill, H.C., Rowan, B., & Ball, D.L. (2005). Effects of teachers' mathematical knowledge for teaching on student achievement. American Educational Research Journal, 42, 371-406.

Ball, D.L., Hill, H.C., & Bass, H. (2005). Knowing mathematics for teaching: Who knows mathematics well enough to teach third grade, and how can we decide? American Educator, Fall 2005, 14-22.

Hill, H.C., Schilling, S.G., & Ball, D.L. (2004). Developing measures of teachers' mathematics knowledge for teaching. Elementary School Journal, 105, 11-30.

Hill, H.C. (2004). Professional development standards and practices in elementary school mathematics. Elementary School Journal, 104, 215-231.

Hill, H.C. & Ball, D.L. (2004). Learning mathematics for teaching: Results from California's Mathematics Professional Development Institutes. Journal for Research in Mathematics Education, 35, 330-351.

Hill, H.C. (2003). Understanding implementation: Street-level bureaucrats' resources for reform. Journal of Public Administration Research and Theory, 13, 265-282.

Hill, H.C. (2001). Policy is not enough: Language and the interpretation of state standards. American Educational Research Journal, 38, 289-320.

Cohen, D.K. & Hill, H.C. (2001). Learning policy: When state education reform works. New Haven, CT: Yale University Press.

Cohen, D.K. & Hill, H.C. (2000). Instructional policy and classroom performance: The mathematics reform in California. Teachers College Record, 102, 296-345.
