
Thomas Kane

Walter H. Gale Professor of Education and Economics


Degree: Ph.D., Harvard University (1991)
Phone: 617.496.4359
Fax: 617.495.2614
Vitae/CV: Thomas Kane.pdf
Office: 50 Church Street, 4th Floor
Faculty Assistant: Sheila Griffin

Profile

Thomas Kane is an economist and Walter H. Gale Professor of Education at the Harvard Graduate School of Education. He is faculty director of the Center for Education Policy Research, a university-wide research center that works with school districts and state agencies. Between 2009 and 2012, he directed the Measures of Effective Teaching project for the Bill & Melinda Gates Foundation. His work has spanned both K-12 and higher education, covering topics such as the design of school accountability systems, teacher recruitment and retention, financial aid for college, race-conscious college admissions and the earnings impacts of community colleges. From 1995 to 1996, Kane served as the senior economist for labor, education, and welfare policy issues within President Clinton's Council of Economic Advisers. From 1991 through 2000, he was a faculty member at the Kennedy School of Government. Kane has also been a professor of public policy at UCLA and has held visiting fellowships at the Brookings Institution and the Hoover Institution at Stanford University.

See a full list of Thomas Kane's courses.

Awards

Professor of the Year, UCLA School of Public Affairs, Public Policy Department (2006)

Sponsored Projects


Monitoring COVID Catch-Up (2021-2022)
Walton Family Foundation

Given the learning losses that U.S. students have experienced during the pandemic, the 2021-22 school year may be the most important year in our nation’s history. The federal government has provided nearly $200 billion to states and districts to support COVID catch-up, and we know millions of students have fallen behind. Yet, we are manifestly unprepared to manage the academic recovery. State test scores from next spring will arrive too late to make a difference. And schools will be implementing new programs—such as tutoring or vacation academies or summer school—that they are not accustomed to tracking. Now more than ever, policymakers and parents will be anxious to know whether students are on track to make up for the ground they lost. School districts will need evidence with which to adjust their programs and strategies. Unfortunately, state data systems will not provide the evidence that districts will need to manage the catch-up and to adjust their strategies.


Re-Assessing Three Decades of U.S. Education Reform (2021-2023)
Walton Family Foundation

Critics and proponents of education reform efforts over the last 30 years have created a narrative that there has been no progress and that achievement gaps are widening. Of course, their motivations are different. Reformers have found it expedient to lament the pace of improvement and widening gaps to make the case for more aggressive action. Their opponents have been making the same complaint, only to argue that past policies such as test-based accountability are ineffective. Over time, the cumulative effect has been to undermine the willingness of policymakers to take the inevitable political risks required for reform. The collective sense of futility both sides have created has itself become an important barrier to continued efforts.

But has progress actually been slow, and have achievement gaps truly continued to widen? In this project we will take a fresh look at the state-by-state gains that have been made on the National Assessment of Educational Progress, the long-term outcomes (e.g., earnings, educational attainment) for those born in states with the largest increases, evidence on income-based gaps in achievement, and the role of state policy (such as test-based accountability or school finance reform) in driving the NAEP trends. To the extent that we can provide a more accurate picture of where progress has been made (and not made) and the role of state policy in driving those changes, we believe the project will have important impacts on the "atmospherics" of the education policy debate, both at the state level and at the federal level.


For research on the effectiveness of COVID-19 educational recovery efforts (2021-2022)
Carnegie Corporation of New York

Given the learning losses that U.S. students have experienced during the pandemic, the 2021-22 school year may be the most important year in our nation's history. The federal government has provided nearly $200 billion to states and districts to support COVID catch-up, and we know millions of students have fallen behind. Yet we are manifestly unprepared to manage the academic recovery. State test scores from next spring will arrive too late to make a difference. And schools will be implementing new programs (such as tutoring, vacation academies, or summer school) that they are not accustomed to tracking. Now more than ever, policymakers and parents will be anxious to know whether students are on track to make up for the ground they lost. School districts will need evidence with which to adjust their programs and strategies. Unfortunately, state data systems will not provide the evidence that districts will need to manage the catch-up and to adjust their strategies.

The United States cannot rely on business-as-usual reporting requirements to provide the evidence we will need to manage the COVID catch-up next year. We (the Center for Education Policy Research at Harvard University, along with partners AIR and NWEA) are requesting funds for 12 months to support addressing the current data gaps and providing partner districts and the field as a whole with timely information on the effectiveness of COVID catch-up strategies. The primary goal of the project is to drive state and federal dollars toward the most effective strategies for helping students catch up. Such evidence will leverage the $200 billion in federal dollars, which will be obligated by fall 2022. Through this project we will provide evidence, not available anywhere else, both to maximize the impact of the federal dollars and to inform and encourage districts and states to allocate the dollars toward the most effective catch-up strategies. We also hope this work will lead to continued partnerships in the future, whereby districts see the value of collecting data on the interventions they are implementing, which would dramatically lower the cost of evaluating such interventions.


U.S. Federal Agency Education Data Fellowship (2020-2022)
Bill and Melinda Gates Foundation

This investment would support the USP Data Strategy BOW outcomes for data systems (modernize data systems across the education-to-workforce ecosystem to ensure the necessary infrastructure is available to readily connect supply and demand from stakeholders) and for governance (strengthen governance within and across data systems to enable a catalytic release of data by lowering the impedance cost of access while rigorously protecting privacy). This investment would also support the K-12 PST Data Infrastructure Primary Outcome of "POs and field have access to richer data sets aligned to our strategy." Depending on the research questions and data of focus, this investment stands to support and accelerate the work of the K-12 BOWs: NSIs, Charter Seat Growth, Charter SWD, PS Pathways, Teacher Prep, SEL, Middle Years Math, Writing, and R&D Infrastructure. Its success would be measured by the increased availability of research data related to educational progress over time and increased use of these data in research projects.


COVID-19 Development of Tools to Reduce Chronic Absenteeism (2020-2022)
Bill and Melinda Gates Foundation

The primary outcome of this grant would be the development of a much more self-guided option for making use of Proving Ground's approach, expertise, and tools, significantly increasing the number of partners they are able to work with. School systems engaging in this modified approach will use it either as a lower-cost entry point into the Proving Ground Network or as a method of maintaining fidelity to the approach after their full-service contract with Proving Ground has ended.


PIER2: Partnering in Education Research (PIER): An Interdisciplinary Pre-doctoral Training Program (2020-2025)
IES

Because of the rich data they control, state and local education agencies are alluring partners for education researchers. Yet all researchers, especially young ones, must be mindful of the potential hazards: how to maintain objectivity, how to deliver bad news, how to negotiate a data use agreement and respect student privacy, how to convert an agency leader's concern into a tractable research question. By renewing the PIER fellowship, we hope to train the next generation of education scholars to navigate such hazards and produce rigorous research that informs practitioner decisions. We will focus on the content area of Improving Education Systems and on the methodological area of Efficacy. Our proposed program includes ten core elements: (i) the coordinated curriculum requirements, (ii) the public seminar with private "director's cut" seminar, (iii) the PIER proseminar course, (iv) the PIER faculty mentor, (v) the research apprenticeship, (vi) the policy internship, (vii) the PIER Summit, (viii) the PIER student seminar, (ix) independent research, and (x) job market support. Together, these elements are designed to provide students with the mix of hard skills, soft skills, and relationships required for successful researcher-practitioner partnerships in education.

Aside from our track record of training and placing many of the top young quantitative education researchers in the field today, our proposal has two primary strengths: (1) Through our Strategic Data Project, we have built a national network of agencies, having trained more than 350 data analysts in more than 125 state and local education agencies and non-profits. Moreover, with our recent IES-funded National Center for Rural Education Research Networks, that network now includes 50+ rural school districts in NY and OH. The size and breadth of the network allow us to accommodate a wide range of student interests and to foster direct relationships between fellows and agency leaders that are not dependent on a faculty member's project. Fellows take those relationships with them as they start their careers. (2) In our first training grant in 2015, we included a policy internship, placing fellows in 10-week summer residencies in a partner agency. Thus, rather than starting from scratch, we are building on that experience and improving an existing policy internship: extending it to a yearlong engagement and adding a faculty coordinator of PIER partnerships (Carrie Conaway) to oversee student placements and partner engagement.

Principal Investigator Tom Kane has extensive experience collaborating with school agencies, including leading the $60 million Measures of Effective Teaching project. Our twenty-six faculty affiliates are drawn from three schools at Harvard (HGSE, HKS, and GSAS), and two more are close collaborators drawn from other Boston-area institutions. Number of PIER Fellows Trained: Approximately 30 doctoral students will receive two- or three-year PIER fellowships. (The exact number depends on cohort size across each of the three years and on when students apply during their graduate career.) Across all cohorts, the PIER fellowship will support 68 fellowship years.


Validation Network (2019-2023)
Bill and Melinda Gates Foundation

Over the last decade, public agencies have invested heavily in testing and in longitudinal data systems, which track the outcomes of individual students over time and connect students to teachers and programs. (The teacher-student links are critical, since so many interventions are delivered through the actions, skills, and expertise of teachers.) Despite this investment, however, the federal Institute of Education Sciences and philanthropies, such as BMGF, still generally use a traditional model of one-off, customized evaluations. This is inefficient and unnecessarily costly. For each new study, the data must be cleaned and assembled, data use agreements negotiated, and a team of researchers spends months testing different statistical models; but after the report is written, all that work is discarded. Just as NASA transitioned from single-use spacecraft to the multi-use Space Shuttle in the 1980s, we seek to build a standing network of districts and charter schools, with data use agreements and analysis files already in place, which can be used to evaluate multiple interventions over multiple years. Such a network would dramatically lower the incremental cost of each new study.

Moreover, given the decentralized nature of decision-making in U.S. education, it is necessary to engage a broader network of educators in the excitement of discovery: proposing solutions, choosing interventions to test, and piloting and evaluating them. The time to build an audience is before an intervention is launched, rather than after the results are released. Too often, educators have been viewed as passive recipients of "evidence" after the fact. No wonder educators are not eager to change what they are doing in response to the latest research! They are truly an afterthought.

Our long-term vision is to build a mechanism for engaging a broader audience from the start: soliciting the feedback of teachers, principals, and district/network leaders to establish design parameters for interventions to be developed, recruiting teachers and schools to participate in pilots for specific interventions, keeping the network informed of pilots underway, and sharing results as soon as the pilots are completed. To do that, we will also build a mechanism for collecting data on additional outcomes and on implementation challenges, beyond test scores alone. To provide that, the network will develop a platform for surveying participating teachers and students in treatment and comparison classrooms. Once the infrastructure is in place, we believe that the field will be able to make much more rapid progress in determining what works at a fraction of current costs. As importantly, such an infrastructure will make it easier for those developing interventions (either the developers themselves or those who fund them) to evaluate efficacy in a reasonable timeframe and incorporate feedback from the field. Finally, schools, teachers, and students (the "end users") will, over time, be faced with an array of tested products rather than unfounded claims.

Although the model will require philanthropic support to start, we believe that it could be self-sustaining once it is up, running, and at scale. We expect that intervention funders and developers will be willing to pay the costs of the validation network for two reasons: first, the costs (both in time and money) will be substantially below what rigorous evaluations currently cost; and second, partly as a result of the increase in information available due to lower evaluation costs, schools and districts will be more willing to demand validated solutions.


Testing the Role of Schools in Intergenerational Mobility: The Long-term Impact of School Boundary Changes (2019-2022)
Walton Family Foundation

The project plans to explore the feasibility of using two "natural experiments" to test the role of schools in intergenerational mobility. To be used in conjunction with the long-term income data used by the Raj Chetty team at Opportunity Insights, such "experiments" need to fit three criteria:

1. Linkable: The affected students need to be "linkable" to the tax data. One way to do this would be to use the address on parents' tax returns during a student's school-age years. If we know that a policy or intervention affected a particular neighborhood, school catchment area, or school district, Chetty and his team can identify the families with children who were living in that geographical area at the time.
2. Relevant birth cohorts: The interventions should have affected the 1978 to 1990 birth cohorts (roughly those who graduated from high school between 1996 and 2008). In the "Opportunity Atlas," they focused on the 1978 to 1983 birth cohorts, but they should be able to go a few years later.
3. Large and convincingly exogenous: We need to be able to tell a story that the intervention was large enough to have a detectable effect on earnings and was not subject to selection bias.


National Center on Rural Education Research Networks (2019-2024)
IES

The Center for Education Policy Research (CEPR) is proposing to launch the National Center for Rural Education Research Networks (NCRERN) under the topic of National Research and Development Centers. Over the last 13 years, CEPR has been a leader in building the data capacity of school agencies, having recruited and trained 250+ analysts working in more than 100 SEAs and LEAs. For the last 30 months, we have worked with a network of urban districts and CMOs to design and conduct pilots on educational software and chronic absenteeism through our Proving Ground initiative. We plan to draw from that experience to launch a new network for rural schools. NCRERN proposes to support a network of 60 rural school districts in NY and OH, and to work with 3 replication states (IA, NM, and WY), to address challenges in three key content areas: chronic absenteeism, college readiness (especially Advanced Placement (AP) courses), and college enrollment.

The focused program of research will consist of four distinct types of studies. The historical analyses will explore trends and differences in outcomes by school and subgroup, with the goal of generating hypotheses regarding possible solutions. We will also conduct predictive analyses to identify students most likely to benefit from targeted interventions (e.g., those at risk for chronic absenteeism, those best prepared for AP success, or those needing additional support to enroll in college). The second type of analysis will consist of carefully designed causal analyses of interventions chosen by our partners in each of the three topic areas. These studies will use randomized and quasi-randomized designs to monitor implementation, evaluate immediate impacts, and refine the interventions and improve implementation on a rapid-cycle basis. We will support the partners in monitoring their implementation and refining the interventions over time. The third study will be a replication evaluation in which SEAs in the replication states (IA, NM, and WY) will present to their rural districts the subset of interventions that proved effective in NY and OH. Finally, we will conduct an overall evaluation of the impact of participation in NCRERN by comparing the change in outcomes between the participating districts and comparison districts on a wide range of student outcomes.

NCRERN will be led by Co-PIs Thomas Kane (HGSE), Douglas Staiger (Dartmouth), and Christopher Avery (HKS). We will use regional experts with long experience working with rural schools to serve as network facilitators: John Sipple (Cornell) in New York and Mike Fuller (Muskingum Valley Educational Service Center) in Ohio. We will host regular in-person regional meetings and an annual convening for NCRERN partners. A Center Director will oversee the day-to-day activities and overall work of the center, and a State Network Manager will be placed in both New York and Ohio to manage each state's network of rural districts. Additionally, a Research Manager will lead the full analytics team.

We believe our proposal has several distinctive strengths: our history of building district capacity through our Strategic Data Project; our recent experience running a continuous improvement network through our Proving Ground project; the combined expertise of our project team in rural education, research methods, and change management; the iterative approach to piloting and testing interventions that we envision; the feasibility of the overall evaluation to yield convincing impact estimates of our model; and (if successful) our connections with state regional organizations and state ESSA plans, which will simplify replication.


Proving Ground: A Network for Continuous Improvement Multi-Day Institute (2018-2022)
Carnegie Corporation of New York

Proving Ground is designed to solve three critical challenges.

First, the traditional model of impact evaluation in education is too costly and time-consuming to support district decision-making. The typical federally funded impact study costs more than $3 million and requires more than 3 years to complete. Few districts have the time or resources to undertake such an expense for each of their major initiatives. For example, while federal, state, and local taxpayers have supported the development of impressive data systems for measuring student outcomes and tracking them over time, school agencies still lack the analytic capacity to use those data to evaluate programs in a timely and efficient way.

Second, district leaders rarely have the time to think about the root causes of problems. Instead, they apply solutions without taking the time to understand the problem. Moreover, like any large bureaucracy, they have difficulty bridging organizational siloes to field a coordinated response to the challenges they face.

Third, when it comes to learning about efficacy, school districts are almost always on their own. They lack venues for pooling data, comparing notes, and evaluating initiatives together.

As a result, school districts are constantly cycling through new initiatives, with no valid way to know which are working and which are not, and no organizational framework for building consensus. A "solution" can become popular and spread without any evidence of efficacy. Like the medical field in the decades before the advent of clinical trials, U.S. education has lacked a widespread mechanism for testing its ideas and building consensus around what works. As a result, home remedies abound, bad ideas linger, and effective interventions do not spread. In short, the evidence-making process is not well aligned with the decision-making process in education.


The Long-Run Influence of Charter Schools – Phase I (2018-2024)
Laura and John Arnold Foundation

The Long-Run Influence of Charter Schools (Phase I) project hopes to create a charter school lottery data clearinghouse, making it possible for other researchers to store data, propose projects, and gain secure access to the data for their own research projects. The first phase of the project (the phase covered by this grant) proposes a large-scale data collection effort in a handful of urban centers and will focus on two goals:

• Assembling admission lottery data from past cohorts of charter school applicants in order to estimate impacts on long-term outcomes, such as earnings, college attendance, and home ownership (all based on tax records).
• Assembling admission lottery results from current cohorts of charter school applicants before those data are lost or destroyed, in order to lay the groundwork to estimate impacts on both medium-term outcomes (e.g., student achievement and college-going rates) and long-term outcomes (e.g., wages) in the future.

Three categories of variables will be used to facilitate the analysis of long-term outcomes from the charter schools:

• Linking variables (e.g., student name, gender, parent name, parent address, student date of birth, student grade level, the school a student is currently attending);
• Lottery outcome variables (e.g., the randomized ordering used, any priority categories, whether the student was initially offered a slot, and whether the student was eventually offered a slot at a charter school as the school worked through the waiting list); and
• Whether the student enrolled in the charter school in subsequent school years.

Phase two of the project (not included in this grant) hopes to merge the data onto IRS records.


Partnering in Education Research (PIER): A Predoctoral Interdisciplinary Training Program (2015-2022)
IES

Over the past decade, districts, states, and the federal government have invested heavily in education data systems. Those data are a vital national resource, with the potential to illuminate many longstanding debates in education. Yet the data remain vastly underutilized by the education research community. Through the Partnering in Education Research (PIER) fellowship, we will train the next generation of education scholars to collaborate with school agencies to put the data to use. To succeed, they will need more than the quantitative methodological skills taught in graduate programs. We have designed the program to provide the other soft and hard skills they will need to work successfully with school agencies: translating practitioners' questions into tractable research questions, understanding when and where randomization and quasi-experimental designs are feasible, negotiating the fine points of a data security agreement, and preserving relationships while reporting results with rigor and objectivity.

While on campus, the fellows will work with a community of prominent social scientists from across the Harvard campus (specifically, the Harvard Graduate School of Education (HGSE), the Harvard Graduate School of Arts and Sciences (GSAS), and the Harvard Kennedy School (HKS)). In addition, the fellows will have access to the unique network of more than 60 school agencies around the country with whom the Center for Education Policy Research at Harvard currently works. Each fellow will:

• Serve as a research apprentice on education projects for a PIER core faculty member;
• Be placed in an agency internship with one of our partner education agencies, during which they will design and execute a research project in collaboration with agency leadership;
• Attend an annual Partnership Discovery Event connecting researchers with education leaders to design research projects on particular themes, such as "blended learning" or "teacher evaluation"; and
• Take part in a pro-seminar in education research. The seminar series will include "Director's Cut" sessions given by leading researchers, in which they will describe how their research projects came about and the challenges overcome along the way.

Approximately 28 doctoral students will receive two- or three-year PIER fellowships. Across all cohorts, the PIER fellowship will support 68 fellowship years.

Publications

Thomas J. Kane, David Blazar, Hunter Gehlbach, Miriam Greenberg, David Quinn, and Daniel Thal. "Can Video Technology Improve Teacher Evaluations? An Experimental Study." Education Finance and Policy, Vol. 15, No. 3, pp. 397-427 (2020).

David Blazar, Blake Heller, Thomas J. Kane, Morgan Polikoff, Douglas Staiger, Steve Carrell, and Michal Kurlaender. "Curriculum Reform in the Common Core Era: Evaluating Elementary Math Textbooks Across Six U.S. States." Journal of Policy Analysis and Management, Vol. 39, No. 4, pp. 966-1019 (2020).

Mark Chin, Thomas J. Kane, Whitney Kozakowski, Beth E. Schueler, and Douglas O. Staiger. "School District Reform in Newark: Within- and Between-School Changes in Achievement Growth." Industrial and Labor Relations Review, Vol. 72, No. 2, pp. 323-354 (2018).

Thomas J. Kane. "Develop and Validate--then Scale: Lessons from the Gates Foundation's Effective Teaching Strategy." Education Next, October 15, 2018.

Thomas J. Kane. "Making Evidence Locally: Rethinking Education Research under the Every Student Succeeds Act." Education Next, Vol. 17, No. 2, pp. 52-58 (2017).

Thomas J. Kane. "Shooting Bottle Rockets at the Moon: Overcoming the Legacy of Incremental Education Reform." Brookings Chalkboard, May 29, 2014.

Thomas J. Kane, Kerri Kerr, and Robert Pianta (eds.). Designing Teacher Evaluation Systems: New Guidance from the Measures of Effective Teaching Project (Jossey-Bass, 2014). (Kane was the PI.)

David J. Deming, Justine S. Hastings, Thomas J. Kane, and Douglas O. Staiger. "School Choice, School Quality and Postsecondary Attainment." American Economic Review, Vol. 104, No. 3, pp. 991-1013 (2014).

Thomas J. Kane, Daniel F. McCaffrey, Trey Miller, and Douglas O. Staiger. "Have We Identified Effective Teachers? Validating Measures of Effective Teaching Using Random Assignment" (Seattle, WA: Bill & Melinda Gates Foundation, 2013). (Kane was the PI.)

Jonah Rockoff, Douglas Staiger, Thomas J. Kane, and Eric Taylor. "Information and Employee Evaluation: Evidence from a Randomized Intervention in Public Schools." American Economic Review, Vol. 102, No. 7, pp. 3184-3213 (2012).

Atila Abdulkadiroglu, Josh Angrist, Susan Dynarski, Thomas J. Kane, and Parag Pathak. "Accountability and Flexibility in Public Schools: Evidence from Boston's Charters and Pilots." Quarterly Journal of Economics, Vol. 126, No. 2, pp. 699-748 (2011).

Thomas J. Kane and Douglas O. Staiger. "Volatility in School Test Scores: Implications for Test-Based Accountability Systems." In Diane Ravitch (ed.), Brookings Papers on Education Policy (Washington, DC: Brookings Institution, 2002), pp. 235-283.

Thomas J. Kane and Douglas O. Staiger. "The Promise and Pitfalls of Using Imprecise School Accountability Measures." Journal of Economic Perspectives, Vol. 16, No. 4, pp. 91-114 (2002).

Thomas J. Kane. The Price of Admission: Rethinking How Americans Pay for College (Washington, DC: Brookings Institution, 1999).

Thomas J. Kane and Cecilia Rouse. "Labor Market Returns to Two-Year and Four-Year College." American Economic Review, Vol. 85, No. 3, pp. 600-614 (1995).
