Hazen-Nicoli Professor in Teacher Learning and Practice
Faculty Co-Chair, Teaching and Teacher Leadership (TTL)

Degree: Ph.D., University of Michigan (2000)
Phone: 617.495.1898
Vitae/CV: Heather C. Hill.pdf
Office: Gutman 445
Faculty Coordinator: Natalie J Solomon
Profile
Heather Hill studies policies and programs designed to improve teacher and teaching quality. Her recent research focuses on teacher professional development; instructional coaching; teacher evaluation; changes over time in teachers’ mathematical knowledge and instructional quality in mathematics; and the teacher experiences and characteristics that lead to high-quality instruction and stronger student outcomes. Hill and her team have developed assessments that capture teachers’ mathematical knowledge for teaching and teachers’ mathematical quality of instruction, assessments now widely available to researchers, instructional coaches, evaluators, and policy-makers via online training and administration systems. Hill is a fellow of the American Educational Research Association (AERA), serves on the editorial boards of several journals, and is an advisor to numerous research projects and policy efforts in both the U.S. and abroad. She is co-author of Learning Policy: When State Education Reform Works (2001) with David K. Cohen.
Reconstructing Research in Teacher Education to Provide Usable Knowledge and Support Teacher Education Improvement (2019-2022)
National Science Foundation
Overview. Over the past decade, scholars have developed promising new practices for STEM teacher education. These new practices, however, have not been accompanied by advances toward more rigorous evaluative research, leaving practitioners and policymakers with little evidence to assess the promise of these new approaches. We trace this problem to three underlying factors. The first relates to teacher education’s most prevalent research designs, which generally do not feature comparisons that enable the identification of causal impacts. Second, very few studies assess teaching outcomes that result from the piloted approach or program, perhaps because very few standardized measures of K-12 STEM teacher skills exist. Finally, there is little social and informational infrastructure, so to speak, from which teacher educators can draw to improve their research practice. These three issues – current research norms, a lack of standardized measures of skills and practice, and poor infrastructure for improvement – prevent the field from identifying, improving, and scaling innovative approaches effectively.
The Mathematical Knowledge for Teaching Measures: Refreshing the Item Pool (2016-2019)
National Science Foundation
This late-stage design project, part of the program’s assessment strand, focuses on improving existing measures of teachers’ mathematical knowledge for teaching. Original measure development, which occurred at the University of Michigan during the period 2002-2010, had several goals: to identify the knowledge useful to teachers’ work with students and to explore the possibility that this knowledge is unique to teaching; to provide a set of measurement instruments that could be used in research on teachers’ knowledge; and to provide evaluators with an easy-to-use online administration and scoring system. These efforts resulted in widely disseminated instruments, numerous academic papers, and theoretical progress regarding the mathematical knowledge teachers use in their work. We now seek to update these measures. One reason is their wide use: anecdotal evidence suggests that up to 25% of our target teacher population may have taken a version of these instruments in pre-service training or in-service professional development. To respond to this issue, we will create over 300 items and 10 new sets of parallel forms in the most frequently tested grades and topics. Another reason to update the measures relates to new mathematics content and practice standards; a review of existing forms suggests we could better align our item pools to this key instructional guidance. We also seek to respond to a variety of user requests and to make our online delivery system, the Teacher Knowledge Assessment System (TKAS), more flexible, both in the item formats it can accommodate and in its capacity to handle future form updates.
Hill, H.C. & Grossman, P. (2013). Learning from teacher evaluations: Challenges and opportunities. Harvard Educational Review, 83, 371-384.
Hill, H.C., Beisiegel, M., & Jacob, R. (2013). Professional development research: Consensus, crossroads, and challenges. Educational Researcher, 42(9), 476-487.
Hill, H.C. (in press). The nature and effects of middle school mathematics teacher learning experiences. Teachers College Record.
Hill, H.C., Umland, K. L. & Kapitula, L.R. (2011). A validity argument approach to evaluating value-added scores. American Educational Research Journal.
Hill, H.C. (2010). The nature and predictors of elementary teachers’ Mathematical Knowledge for Teaching. Journal for Research in Mathematics Education, 41(5), 513-545.
Hill, H.C. (2009). Evaluating value-added models: A measurement perspective. Journal of Policy Analysis and Management, 28, 702-709.
Hill, H.C. & Ball, D.L. (2009). The curious — and crucial — case of Mathematical Knowledge for Teaching. Phi Delta Kappan, 91, 68-71.
Hill, H.C. & Shih, J. (2009) Research commentary: Examining the quality of statistical mathematics education research. Journal for Research in Mathematics Education.
Delaney, S.F., Ball, D.L., Hill, H.C., Schilling, S.G., & Zopf, D.A. (2008). Adapting U.S. measures of “Mathematical Knowledge for Teaching” for use in Ireland. Journal of Mathematics Teacher Education, 11, 171-197.
Hill, H.C., Ball, D.L. & Schilling, S.G. (2008). Unpacking “Pedagogical Content Knowledge”: Conceptualizing and measuring teachers’ topic-specific knowledge of students. Journal for Research in Mathematics Education, 39, 372-400.
Hill, H.C., Blunk, M., Charalambous, C., Lewis, J., Phelps, G.C., Sleep, L., & Ball, D.L. (2008). Mathematical Knowledge for Teaching and the Mathematical Quality of Instruction: An Exploratory Study. Cognition and Instruction, 26, 430-511.
Hill, H.C., Ball, D.L., Sleep, L. & Lewis, J.M. (2007). Assessing Teachers’ Mathematical Knowledge: What Knowledge Matters and What Evidence Counts? In F. Lester (Ed.), Second Handbook of Research on Mathematics Teaching and Learning (pp. 111-155). Charlotte, NC: Information Age Publishing.
Hill, H.C. (2007). Mathematical knowledge of middle school teachers: Implications for the No Child Left Behind policy initiative. Educational Evaluation and Policy Analysis, 29, 95-114.
Hill, H.C. (2007). Teachers’ Ongoing Learning: Evidence from Research and Practice. The Future of Children, 17, 111-128.
Hill, H.C. & Lubienski, S.T. (2007) Teachers’ mathematics knowledge for teaching and school context: A study of California teachers. Educational Policy 21(5), 747-768.
Schilling, S.G., Blunk, M. & Hill, H.C. (2007). Test Validation and the MKT Measures: Generalizations and Conclusions. Measurement: Interdisciplinary Research and Perspectives, 5(2-3), 118-127.
Schilling, S.G. & Hill, H.C. (2007). Assessing Measures of Mathematical Knowledge for Teaching: A Validity Argument Approach. Measurement: Interdisciplinary Research and Perspectives, 5(2-3), 70-80.
Hill, H.C., Ball, D.L., Blunk, M., Goffney, I.M. & Rowan, B. (2007). Validating the ecological assumption: The relationship of measure scores to classroom teaching and student learning. Measurement: Interdisciplinary Research and Perspectives, 5(2-3), 107-117.
Hill, H.C., Dean, C. & Goffney, I.M. (2007). Assessing Elemental and Structural Validity: Data from Teachers, Non-teachers, and Mathematicians. Measurement: Interdisciplinary Research and Perspectives, 5(2-3), 81-92.
Learning Mathematics for Teaching. (2006). A coding rubric for measuring the quality of mathematics in instruction. Ann Arbor, MI: Authors.
Hill, H.C. (2006). Language matters: How characteristics of language complicate policy implementation. In M.I. Honig (Ed.), New directions in education policy implementation: Confronting complexity. Albany, NY: SUNY Press.
Hill, H.C. (2005). Content across communities: Validating measures of elementary mathematics instruction. Educational Policy 19, 447-475.
Hill, H.C., Rowan, B., & Ball, D.L. (2005) Effects of teachers' mathematical knowledge for teaching on student achievement. American Educational Research Journal 42, 371-406.
Ball, D.L., Hill, H.C. & Bass, H. (2005) Knowing mathematics for teaching: Who knows mathematics well enough to teach third grade, and how can we decide? American Educator, Fall 2005, 14-22.
Hill, H. C. (2004) Professional development standards and practices in elementary school mathematics. Elementary School Journal 104, 215-31.
Hill, H.C. & Ball, D.L. (2004). Learning mathematics for teaching: Results from California’s Mathematics Professional Development Institutes. Journal for Research in Mathematics Education, 35, 330-351.
Hill, H.C., Schilling, S.G., & Ball, D.L. (2004) Developing measures of teachers’ mathematics knowledge for teaching. Elementary School Journal 105, 11-30.
Hill, H.C. (2003) Understanding implementation: Street-level bureaucrats’ resources for reform. Journal of Public Administration Research and Theory 13, 265-282.
Cohen, D.K. & Hill, H.C. (2001) Learning policy: When state education reform works. New Haven, CT: Yale University Press.
Hill, H.C. (2001). Policy is not enough: Language and the interpretation of state standards. American Educational Research Journal, 38, 289-320.
Cohen D. K. & Hill, H.C. (2000) Instructional policy and classroom performance: The mathematics reform in California. Teachers College Record 102, 296-345.