Thomas Kane is an economist and Walter H. Gale Professor of Education at the Harvard Graduate School of Education. He is faculty director of the Center for Education Policy Research, a university-wide research center that works with school districts and state agencies. Between 2009 and 2012, he directed the Measures of Effective Teaching project for the Bill & Melinda Gates Foundation. His work has spanned both K-12 and higher education, covering topics such as the design of school accountability systems, teacher recruitment and retention, financial aid for college, race-conscious college admissions and the earnings impacts of community colleges. From 1995 to 1996, Kane served as the senior economist for labor, education, and welfare policy issues within President Clinton's Council of Economic Advisers. From 1991 through 2000, he was a faculty member at the Kennedy School of Government. Kane has also been a professor of public policy at UCLA and has held visiting fellowships at the Brookings Institution and the Hoover Institution at Stanford University.
Professor of the Year, UCLA School of Public Affairs, Public Policy Department (2006)
In education, most new ideas will fail. The process by which children and adults learn is simply too complex to anticipate every obstacle. Ultimately, the rate of progress in U.S. education will be determined not by the quality of the initial reform ideas, but by the speed with which we can identify the subset of those ideas that are effective. We believe that a primary reason for our slow progress is our inability to learn what works in a timely and efficient manner and to engage state and local leaders in that learning process.
The Center for Education Policy Research at Harvard University is working with a network of school agencies to launch a system of low-cost, rapid-cycle, ongoing trials. The work is guided by seven design principles:
o State and local decision makers (not other researchers, nor the federal government) must be the primary audience.
o The evidence must illuminate interim outcomes and implementation, not simply provide a summative verdict on effectiveness after the fact.
o Whenever feasible, solutions should be piloted in a subset of schools and classrooms and rolled out broadly only when the early results are promising.
o The reporting cycle must be less than one year.
o To keep costs low, the system must rely primarily on existing data (such as student achievement, grade retention, attendance, graduation, college-going, and student log files from educational software providers).
o The focus must be on policies and services that could plausibly be scaled up quickly.
o Reports must be public and distributed to network leadership.
Over the past decade, districts, states and the federal government have invested heavily in education data systems. Those data are a vital national resource, with the potential to illuminate many longstanding debates in education. Yet the data remain vastly underutilized by the education research community. Through the Partnering in Education Research (PIER) fellowship, we will train the next generation of education scholars to collaborate with school agencies to put the data to use. To succeed, they will need more than the quantitative methodological skills taught in graduate programs. We have designed the program to provide the other soft and hard skills they will need to work successfully with school agencies: translating practitioners' questions into tractable research questions, understanding when and where randomized and quasi-experimental designs are feasible, negotiating the fine points of a data security agreement, and preserving relationships while reporting results with rigor and objectivity.
While on campus, the fellows will work with a community of prominent social scientists from across the Harvard campus (specifically, the Harvard Graduate School of Education (HGSE), the Harvard Graduate School of Arts and Sciences (GSAS), and the Harvard Kennedy School (HKS)). In addition, the fellows will have access to the unique network of more than 60 school agencies around the country with whom the Center for Education Policy Research at Harvard currently works. Each fellow will:
Serve as a research apprentice on education projects for a PIER core faculty member;
Be placed in an agency internship with one of our partner education agencies, during which they will design and execute a research project in collaboration with agency leadership;
Attend an annual Partnership Discovery Event connecting researchers with education leaders to design research projects on particular themes, such as blended learning or teacher evaluation;
Take part in a pro-seminar in education research. The seminar series will include "Director's Cut" sessions given by leading researchers, in which they describe how their research projects came about and the challenges they overcame along the way.
Approximately 28 doctoral students will receive two- or three-year PIER fellowships. Across all cohorts, the PIER fellowship will support 68 fellowship years.
The purpose of this project is to conduct the evaluation portion of a randomized controlled trial designed to assess the impact of The Match School Foundation, Inc.'s training program for novice teachers on outcomes such as the achievement growth of teachers' students, principal ratings, and retention, with the ultimate goal of improving K-12 education.
The Center for Education Policy Research (CEPR) has partnered with The Match School Foundation, Inc. ("Match") to evaluate the efficacy of Match's teacher training program. CEPR runs and evaluates the Match Teacher Launch Project ("TLP"), a training program for first-year teachers. TLP uses the Match Teacher Residency ("MTR") methodology and coursework to prepare more effective first-year teachers. CEPR and Match target "regular" pre-service teachers, i.e., those receiving a degree from a traditional education school ("TLP Participants"), to receive this distilled version of MTR programming. Over the course of two months, TLP Participants take three courses of thirty to forty hours each ("Summer Training") and receive weekly coaching during the first four months of teaching ("Fall Coaching"). TLP is designed as a randomized controlled trial, piloted in 2015 ("Year 1") and fully run in 2016 and 2017. CEPR leads the research and evaluation effort and measures effectiveness via outcomes including measures of TLP Participant achievement, attitudes, beliefs, practices, and job satisfaction; principal ratings; and overall retention.
Rather than attempt to make a summative statement about the success of Newark school reform, this research project focuses on key questions at the core of the theory of action in Newark. If we can shed light on those, we can both inform school reform efforts around the country and provide feedback to Newark leadership to guide modifications to the plan.
Specifically, the research team studies the following five questions:
1. What is the effect of a school becoming a "renewal school" on individual students' trajectories?
2. Has Newark been having the "right" turnover? That is, have less effective teachers been leaving at higher rates than more effective teachers?
3. How has the equity of access to quality seats changed in Newark since the reforms began?
4. What is the impact of winning the student placement lottery on student outcomes (for both oversubscribed charter schools and public schools)?
5. What has been the early impact of the reforms on students' longer-term outcomes (e.g., college trajectories)?
The Center for Education Policy Research - through the One Penny project - is investigating the development of a new organization focused on the creation, coordination, and dissemination of evidence-based education research in Massachusetts. The purpose of this grant is to help CEPR engage in more in-depth strategic planning for the One Penny project.
A Discovery phase includes interviews with thought leaders in the space, development of case studies from other sectors, framing of potential organizational models, and an economic impact analysis. A Plan Development phase includes drafting a strategic plan with a vision, goals, activities, organizational structure, financial plan, and implementation plan for the organization. The final deliverable is a written strategic plan for the proposed organization.
This project evaluates math options in Tennessee for high school seniors whose previous achievement in math would require them to take developmental math in college. Chief among these options is SAILS, a computer-enabled developmental math program for high school seniors. The program was developed in 2012 by Chattanooga State Community College and is now being scaled across the state. Students are assigned to SAILS based on an ACT math cut score, and those who successfully complete the program's modules are not required to take developmental math in community college and can enroll immediately in credit-bearing college math courses. The evaluation examines whether the SAILS model improves student achievement in math and leads to better long-term postsecondary outcomes than other approaches to delivering remediation currently used in Tennessee. It also compares the short- and long-run costs and savings associated with SAILS to those of other approaches and uses qualitative methods to identify successful implementation practices for SAILS.
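Because SAILS assignment hinges on an ACT math cut score, one natural analytic strategy is a regression discontinuity comparison of students just above and below the cutoff. Whether the evaluation uses exactly this design is an assumption here; the cutoff value, bandwidth, and all data below are simulated for illustration only.

```python
# Illustrative regression discontinuity sketch around a test-score cutoff.
# All numbers (cutoff, bandwidth, effect size) are hypothetical.
import numpy as np

rng = np.random.default_rng(2)
n = 4000
act = rng.integers(14, 26, n).astype(float)   # ACT math scores (illustrative range)
cutoff = 19.0                                  # hypothetical assignment cutoff
assigned = (act < cutoff).astype(float)        # below the cutoff -> assigned to program

# Simulated college math outcome: smooth in ACT score, plus a jump at the
# cutoff representing the program's effect on assigned students.
jump = 0.15
outcome = 0.05 * (act - cutoff) + jump * assigned + rng.normal(0.0, 0.5, n)

# Local linear fit on each side of the cutoff, within a bandwidth.
bw = 3.0

def fit_at_cutoff(mask):
    """Predicted outcome at the cutoff from a linear fit on one side."""
    X = np.column_stack([np.ones(mask.sum()), act[mask] - cutoff])
    b, *_ = np.linalg.lstsq(X, outcome[mask], rcond=None)
    return b[0]

below = (act >= cutoff - bw) & (act < cutoff)
above = (act >= cutoff) & (act < cutoff + bw)
rd_estimate = fit_at_cutoff(below) - fit_at_cutoff(above)
print(f"RD estimate of program effect at the cutoff: {rd_estimate:.3f}")
```

The estimate recovers the simulated jump because students just on either side of the cutoff are otherwise comparable; in practice the bandwidth and functional form would be chosen with care.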
During the summer of 2015, parents and teachers around the country will be receiving bad news. Millions of children, previously believed to be proficient in math and English, will fail to meet the new Common Core standards. Recriminations will be directed at state departments of education for not providing sufficient curriculum materials, at district leaders for not preparing students and teachers adequately, at testing contractors for logistical snafus and at federal bureaucrats for interfering with state standard-setting.
Here's a plan to put a hopeful message in a bottle for the summer of 2015:
CEPR would work with America Achieves to design a teacher survey asking about which of the identified treatments and other Common Core implementation strategies (e.g., instructional coaching using digital video captured by coached teachers, a specific textbook or curriculum, supplemental programs for students, etc.) their school is using, along with a principal survey covering similar topics. CEPR would then assemble a student-level analysis file linked to the teacher survey responses regarding the implementation strategies.
Along with the Message in a Bottle project proposed above, CEPR can provide support and coaching to state fellows on the organizational capacity and data infrastructure needed to allow evidence to drive policy and support districts and schools.
Scope of Work: To provide evaluation, management, and analysis services to the Massachusetts Department of Elementary and Secondary Education (MA DESE). The Center for Education Policy Research at Harvard University (CEPR) is providing these services as part of its Strategic Data Project (SDP), an initiative funded in part by the Bill & Melinda Gates Foundation. The work is a continuation of the analytic services Harvard University has provided MA DESE over the past twenty months through the SDP Fellows program.
Findings from the Measures of Effective Teaching project have shown that classroom observations are valid predictors of teacher effectiveness. However, implementing classroom observations with rigor and at scale is challenging for even the most resourceful school and district leaders. The Best Foot Forward Project aims to learn whether school districts can use video technology to make the observation process easier to implement, less costly, more acceptable to teachers, and more valid and reliable.
The Best Foot Forward project measures the impact of using digital video for classroom observations and for promoting teacher growth. In the 2013-2014 school year, with 200 teachers (in 50 schools) in the treatment group and an equal number in a control group participating, we examine whether digital video technology can improve teaching practice and student outcomes in treatment classrooms; whether it is preferred by both teachers and principals to in-person observations; and whether it presents a cost-effective, scalable alternative to in-person observations.
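The core impact estimate in a two-arm trial like this one is a comparison of mean outcomes between treatment and control classrooms, usually with a baseline covariate added for precision. The sketch below uses simulated data (the effect size and baseline relationship are assumptions, not project results); only the 200-teachers-per-arm design comes from the description above.

```python
# Minimal sketch of a regression-adjusted impact estimate for a two-arm
# trial with 200 teachers per arm. All data are simulated; the 0.10 effect
# is an assumed value for illustration, not a project finding.
import numpy as np

rng = np.random.default_rng(1)
n = 200  # teachers per arm, as in the study design

baseline = rng.normal(0.0, 1.0, 2 * n)        # prior-year class achievement
treat = np.repeat([1.0, 0.0], n)               # 1 = video-observation arm
effect = 0.10                                   # assumed treatment effect (SD units)
outcome = 0.5 * baseline + effect * treat + rng.normal(0.0, 1.0, 2 * n)

# Regressing outcome on treatment and baseline improves precision over a
# raw difference in means, since baseline absorbs outcome variance.
X = np.column_stack([np.ones(2 * n), treat, baseline])
beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
resid = outcome - X @ beta
dof = 2 * n - X.shape[1]
cov = (resid @ resid / dof) * np.linalg.inv(X.T @ X)
est, se = beta[1], np.sqrt(cov[1, 1])
print(f"estimated impact: {est:.3f} (SE {se:.3f})")
```

A real analysis of this study would also account for the clustering of teachers within the 50 schools; the sketch omits that for brevity.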
The Center for Education Policy Research (CEPR) is working with the College Board to measure students' access to "rigorous work worth doing." In high school, that means that students with general preparation in mathematics and literacy are taking rigorous coursework, such as the advanced science or mathematics required by colleges, AP courses, or IB programs. In senior year of high school, it means that qualified students are applying to college. In the transition from high school to college, it means that students are enrolling in colleges that are at or above their level of academic preparation. Key areas of investigation include:
1. Working with the College Board to develop a measurement strategy (that is, identifying specific ways to measure students' exposure to rigorous work worth doing and describing ways in which the data could be acquired and assembled); and
2. Working with the College Board to pilot-test interventions which increase student exposure to rigorous work.
The focus is particularly on improving outcomes for the most disadvantaged students in the US.
Topic: Education Research and Development Centers: Teacher Effectiveness
Focus: Mathematics Instruction, Grades 4 through 6
Overview: The Harvard Graduate School of Education (HGSE) is well placed to combine the disparate strands of research and accelerate the search for a valid, scalable measure of teacher effectiveness. The proposed center will have four primary goals:
1. Unify three disparate strands of research: value-added modeling, the direct study of math instruction, and non-instructional predictors of teacher effectiveness (e.g., assessments of teacher knowledge, student evaluations). The team of investigators includes leaders from all three areas.
2. Develop an empirically derived composite measure of teacher effectiveness that efficiently combines estimated impacts on student achievement, ratings of pedagogical practice on both general and content-specific observation protocols, tests of teachers' mathematical knowledge, and student evaluations.
3. Operationalize the measures to ensure that they are usable in the field, with training videos for principals, test items for teacher assessments, student evaluation forms, etc.
4. Externally validate the estimated differences in teacher effectiveness by randomly assigning classroom rosters within teams of teachers working in the same grades and schools, then comparing their students' achievement gains.
Partner Districts: District of Columbia and Charlotte-Mecklenburg Public Schools
Identification Study: Over two years, we will administer a student assessment in the fall and spring; collect four video observations per teacher each year; rate those videos using both general and content-specific rubrics; assess teachers with the test of Mathematical Knowledge for Teaching; and administer student evaluations.
Using a novel empirical approach, we will form an empirical Bayes composite measure, efficiently combining all of the above in the manner that best predicts student achievement gains in a non-experimental setting.
Validation Study: In a third year of data collection, we will randomly assign classrooms of students to teachers working in the same grades and schools. We will repeat the video collection and ask principals, using the training materials provided by the research team, to rate their teachers' instruction. Using those videos, we will confirm that the principals' ratings of instruction are consistent with the researcher ratings. We will also test whether the predicted differences in student achievement based on the composite measure are borne out after random assignment. If not, we will use the student outcomes following random assignment to recalibrate the composite measure.
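The composite-weighting idea can be illustrated with a small simulation: regress future achievement gains on several noisy component measures, so noisier components automatically receive smaller weights. This is a sketch of the general principle, not the center's actual model (which would also involve empirical Bayes shrinkage of each component); every number below is simulated.

```python
# Sketch of forming a composite predictor of teacher effectiveness from
# several noisy component measures. All data and noise levels are simulated.
import numpy as np

rng = np.random.default_rng(0)
n_teachers = 5000

# Latent "true" teacher effect, in student test-score SD units (assumed scale).
mu = rng.normal(0.0, 0.15, n_teachers)

# Four noisy component measures, each an error-laden read on the same effect.
value_added = mu + rng.normal(0.0, 0.10, n_teachers)  # prior value-added estimate
observation = mu + rng.normal(0.0, 0.20, n_teachers)  # observation-protocol rating
knowledge = mu + rng.normal(0.0, 0.25, n_teachers)    # teacher knowledge test
survey = mu + rng.normal(0.0, 0.20, n_teachers)       # student survey measure

# Outcome to predict: a future achievement gain for each teacher's class.
gain = mu + rng.normal(0.0, 0.10, n_teachers)

# OLS of gains on the components yields composite weights; noisier
# components get smaller weights automatically.
X = np.column_stack([np.ones(n_teachers), value_added, observation, knowledge, survey])
beta, *_ = np.linalg.lstsq(X, gain, rcond=None)
composite = X @ beta

def r2(pred, y):
    """Share of variance in y explained by the prediction."""
    return 1.0 - np.var(y - pred) / np.var(y)

slope, intercept = np.polyfit(value_added, gain, 1)
r2_va = r2(slope * value_added + intercept, gain)
r2_comp = r2(composite, gain)
print(f"R2, value-added alone: {r2_va:.3f}")
print(f"R2, composite:         {r2_comp:.3f}")
```

In this simulation the composite predicts future gains better than any single component, which is the logic behind combining the measures before the random-assignment validation.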
Higher education is a vital part of American society. It is therefore not surprising that economists are interested in studying the financing of college and the factors that affect student success and graduation. However, the economics of higher education is significantly limited by the scarcity of relevant data that are easily accessible and usable by investigators. Among the untapped resources are administrative databases: unit-record data on students and institutions that may be available at the state, university, or school level. They hold the promise of helping researchers answer some of the most pressing questions related to education. Unfortunately, because of the significant upfront costs of securing access, encoding the data to protect student confidentiality, and structuring the data for research purposes, it is difficult for researchers to fully utilize these potential sources of information. Mirroring the issues faced by academic researchers, states, university systems, and districts also have limited capacity to perform research that would help them understand the problems and issues facing their stakeholders.
Thomas J. Kane, "Let the Numbers Have Their Say: Evidence on Massachusetts' Charter Schools," Center for Education Policy Research at Harvard University, 2016. http://cepr.harvard.edu/files/cepr/files/evidence-ma-charter-schools.pdf
Thomas J. Kane, "The Cost of the Charter School Cap: Evidence Shows Low-Income, Urban Students Pay the Price," CommonWealth Magazine, October 5, 2016. http://commonwealthmagazine.org/education/the-cost-of-the-charter-school...
David J. Deming, Justine S. Hastings, Thomas J. Kane and Douglas O. Staiger, "School Choice, School Quality and Postsecondary Attainment," American Economic Review, Vol. 104, No. 3 (2014), pp. 991-1013.
Thomas J. Kane, Daniel F. McCaffrey, Trey Miller and Douglas O. Staiger, "Have We Identified Effective Teachers? Validating Measures of Effective Teaching Using Random Assignment" (Seattle, WA: Bill & Melinda Gates Foundation, 2013).
Jonah Rockoff, Douglas Staiger, Thomas J. Kane and Eric Taylor, "Information and Employee Evaluation: Evidence from a Randomized Intervention in Public Schools," American Economic Review, Vol. 102, No. 7 (2012), pp. 3184-3213.
Thomas J. Kane, "A Layered Approach to Teacher Assessment," Education Next, Vol. 12, No. 3 (Summer 2012).
John P. Papay, Martin R. West, Jon B. Fullerton and Thomas J. Kane, "Does an Urban Teacher Residency Increase Student Achievement? Early Evidence from Boston," Educational Evaluation and Policy Analysis, Vol. 34, No. 4 (2012), pp. 413-434.
Joshua Angrist, Susan Dynarski, Thomas Kane, Parag Pathak and Christopher Walters, "Who Benefits from KIPP?" Journal of Policy Analysis and Management, Vol. 31, No. 4 (2012), pp. 837-860.
Thomas J. Kane and Douglas O. Staiger, "Gathering Feedback on Teaching: Combining High-Quality Observations with Student Surveys and Achievement Gains" (Seattle, WA: Bill & Melinda Gates Foundation, 2012).
Thomas J. Kane, Amy L. Wooten, Eric S. Taylor and John H. Tyler, "Evaluating Teacher Effectiveness: Can Classroom Observations Identify Practices That Raise Achievement?" Education Next, Vol. 11, No. 3 (Summer 2011).
Brian Jacob, Thomas J. Kane, Jonah Rockoff and Douglas Staiger, "Can You Recognize an Effective Teacher When You Recruit One?" Education Finance and Policy, Vol. 6, No. 1 (Winter 2011), pp. 43-74.
Atila Abdulkadiroglu, Josh Angrist, Susan Dynarski, Thomas J. Kane and Parag Pathak, "Accountability and Flexibility in Public Schools: Evidence from Boston's Charters and Pilots," Quarterly Journal of Economics, Vol. 126, No. 2 (2011), pp. 699-748.
Thomas J. Kane, Eric Taylor, John Tyler and Amy Wooten, "Identifying Effective Classroom Practices Using Student Achievement Data," Journal of Human Resources, Vol. 46, No. 3 (2011), pp. 587-613.
Joshua Angrist, Susan Dynarski, Thomas J. Kane, Parag Pathak and Christopher Walters, "Inputs and Impacts in Charter Schools: KIPP Lynn," American Economic Review, Vol. 100, No. 2 (May 2010), pp. 239-243.
John Tyler, Eric Taylor, Thomas J. Kane and Amy Wooten, "Using Student Performance Data to Identify Effective Classroom Practices," American Economic Review, Vol. 100, No. 2 (May 2010), pp. 256-260.
Thomas J. Kane, Jonah Rockoff and Douglas Staiger, "What Does Certification Tell Us about Teacher Effectiveness? Evidence from New York City," Economics of Education Review, Vol. 27, No. 6 (2008), pp. 615-631.
Thomas J. Kane, "Evaluating the Impact of the DC Tuition Assistance Grant Program," Journal of Human Resources, 2007.
Thomas J. Kane, Jonah Rockoff and Douglas O. Staiger, "Photo Finish: Certification Does Not Guarantee a Winner," Education Next, No. 1 (2007), pp. 60-67.
Thomas J. Kane, "Public Intervention in Postsecondary Education," in Eric Hanushek and Finis Welch (eds.), Handbook on the Economics of Education (Amsterdam: Elsevier/North Holland, 2006).
Justine S. Hastings, Thomas J. Kane and Douglas O. Staiger, "Gender and Performance: Evidence from School Assignment by Randomized Lottery," American Economic Review, Vol. 96, No. 2 (May 2006), pp. 232-236.
Thomas J. Kane, Stephanie Riegg Cellini and Douglas O. Staiger, "School Quality, Neighborhoods and Housing Prices," American Law and Economics Review, Vol. 8 (2006), pp. 183-212. (Lead article in special issue on the 50th anniversary of Brown v. Board of Education.)
"What Does Certification Tell Us about Teacher Effectiveness? Evidence from New York City" (with J.E. Rockoff and D.O. Staiger), 2006.
Robert Gordon, Thomas J. Kane and Douglas O. Staiger, "Identifying Effective Teachers Using Performance on the Job," Hamilton Project Discussion Paper (Washington, DC: Brookings Institution, March 2006).
Thomas J. Kane, Peter R. Orszag and Emil Apostolov, "Higher Education Appropriations and Public Universities: The Role of Medicaid and the Business Cycle," Brookings-Wharton Papers on Urban Affairs, 2005, pp. 99-127.
"School Quality, Neighborhoods and Housing Prices: The Impacts of School Desegregation" (with D.O. Staiger and S.K. Riegg), NBER Working Paper No. 11347, 2005.
Thomas J. Kane and Christopher Avery, "Student Perceptions of College Opportunities: The Boston COACH Program," in Caroline Hoxby (ed.), College Choices: The Economics of Where to Go, When to Go, and How to Pay for It (Chicago: University of Chicago Press, 2004).
Thomas J. Kane, "College-Going and Inequality," in Kathryn Neckerman (ed.), Social Inequality (New York: Russell Sage Foundation, 2004).
"Evaluating the Impact of the D.C. Tuition Assistance Grant Program," National Bureau of Economic Research Working Paper No. 10658, 2004.
Thomas J. Kane and Douglas O. Staiger, "Unintended Consequences of Racial Subgroup Rules," in Paul E. Peterson and Martin R. West (eds.), No Child Left Behind? The Politics and Practice of Accountability (Washington, DC: Brookings Institution Press, 2003).
Thomas J. Kane and Peter R. Orszag, "Higher Education Spending: The Role of Medicaid and the Business Cycle," Brookings Institution Policy Brief No. 124, September 2003.
Thomas J. Kane, Douglas O. Staiger and Gavin Samms, "School Accountability Ratings and Housing Values," in William Gale and Janet Pack (eds.), Brookings-Wharton Papers on Urban Affairs, 2003 (Washington, DC: Brookings Institution, 2003), pp. 83-137.
Thomas J. Kane, "The Long Road to Race-Blindness," Science, Vol. 302 (October 24, 2003), pp. 571-573.
Thomas J. Kane and Douglas O. Staiger, "The Promise and Pitfalls of Using Imprecise School Accountability Measures," Journal of Economic Perspectives, Vol. 16, No. 4 (Fall 2002), pp. 91-114.
Thomas J. Kane, Douglas Staiger and Jeffrey Geppert, "Randomly Accountable: Test Scores and Volatility," Education Next, Vol. 2, No. 1 (Spring 2002), pp. 56-61.
Thomas J. Kane and Douglas O. Staiger, "Volatility in School Test Scores: Implications for Test-Based Accountability Systems," in Diane Ravitch (ed.), Brookings Papers on Education Policy, 2002 (Washington, DC: Brookings Institution, 2002).
Thomas J. Kane, "Assessing the American Financial Aid System: What We Know, What We Need to Know," in Maureen Devlin (ed.), Forum Futures 2001: Exploring the Future of Higher Education (Cambridge, MA: Forum for the Future of Higher Education, 2001), pp. 63-66.
David T. Ellwood and Thomas J. Kane, "Who Is Getting a College Education? Family Background and the Growing Gaps in Enrollment," in Sheldon Danziger and Jane Waldfogel (eds.), Securing the Future (New York: Russell Sage, 2000).
Thomas J. Kane, "Rethinking How Americans Pay for College," Milken Institute Review, August 1999. (Cover story.)
Thomas J. Kane, "Reforming Public Subsidies for Higher Education," in Marvin Kosters (ed.), Financing College Tuition: Government Policies and Social Priorities (Washington: American Enterprise Institute, 1999).
Thomas J. Kane, "Student Aid After Tax Reform: Risks and Opportunities," in Jacqueline King (ed.), Financing a College Education: How It Works, How It's Changing (Phoenix: Oryx Press, 1999), pp. 137-150.
William T. Dickens and Thomas J. Kane, "Racial Test Score Differences as Evidence of Reverse Discrimination: Less Than Meets the Eye," Industrial Relations, Vol. 38, No. 3 (1999), pp. 331-363.
Phil Levine, Douglas O. Staiger, Thomas J. Kane and David Zimmerman, "Roe v. Wade and American Fertility," American Journal of Public Health, Vol. 89, No. 2 (1999), pp. 199-203.
Thomas J. Kane, The Price of Admission: Rethinking How Americans Pay for College (Washington, DC: Brookings Institution, 1999).
Thomas J. Kane and Cecilia E. Rouse, "The Community College: Training Students at the Margin Between College and Work," Journal of Economic Perspectives, Vol. 13, No. 1 (1999), pp. 63-84.
Thomas J. Kane, "Savings Incentives for Higher Education," National Tax Journal, Vol. 51, No. 3 (1998), pp. 609-620.
Thomas J. Kane, "Misconceptions in the Debate over Affirmative Action in College Admissions," in Gary Orfield and Edward Miller (eds.), Chilling Admissions: The Affirmative Action Crisis and the Search for Alternatives (Cambridge, MA: Harvard Education Publishing Group, 1998).
Thomas J. Kane, "Racial and Ethnic Preference in College Admissions," in Christopher Jencks and Meredith Phillips (eds.), The Black-White Test Score Gap (Washington: Brookings Institution, 1998).
Dietmar Harhoff and Thomas J. Kane, "Financing Apprenticeship Training: Evidence from Germany," Journal of Population Economics, Vol. 10, No. 2 (1997), pp. 171-196.
Thomas J. Kane, "Postsecondary and Vocational Education: Keeping Track of the College Track," in Indicators of Children's Well-Being (New York: Russell Sage, 1997).
Thomas J. Kane, "Beyond Tax Relief: Long-Term Challenges in Financing Higher Education," National Tax Journal, Vol. 50, No. 2 (1997), pp. 335-349.
Thomas J. Kane, "Lessons from the Largest School Voucher Program Ever: Two Decades of Experience with Pell Grants," in Bruce Fuller and Richard Elmore with Gary Orfield (eds.), Who Chooses? Who Loses? Culture, Institutions and the Unequal Effects of School Choice (New York: Teachers College Press, 1996).
Thomas J. Kane and William T. Dickens, "Racial and Ethnic Preference in College Admissions," Brookings Institution Policy Brief No. 9, November 1996.
Thomas J. Kane and Douglas Staiger, "Teen Motherhood and Abortion Access," Quarterly Journal of Economics, Vol. 111, No. 2 (1996), pp. 467-506.
Thomas J. Kane, "College Cost, Borrowing Constraints and the Timing of College Entry," Eastern Economic Journal, Vol. 22, No. 2 (1996), pp. 181-194.
William T. Dickens, Thomas J. Kane and Charles Schultze, "Does The Bell Curve Ring True?" Brookings Review, Vol. 13, No. 3 (1995), pp. 18-23.
Thomas J. Kane and Cecilia Rouse, "Comment on W. Norton Grubb, 'The Varied Economic Returns to Postsecondary Education: New Evidence from the Class of 1972,'" Journal of Human Resources, Vol. 30, No. 1 (1995), pp. 205-221.
Thomas J. Kane and Cecilia Rouse, "Labor Market Returns to Two-Year and Four-Year College," American Economic Review, Vol. 85, No. 3 (1995), pp. 600-614.
Thomas J. Kane and Mary Jo Bane, "The Context for Welfare Reform," in Mary Jo Bane and David T. Ellwood, Welfare Realities (Cambridge: Harvard University Press, 1994).
Thomas J. Kane, "College Attendance by Blacks Since 1970: The Role of College Cost, Family Background and the Returns to Education," Journal of Political Economy, Vol. 102, No. 5 (1994), pp. 878-911.
Edward Lascher, Steven Kelman and Thomas J. Kane, "Policy Views, Constituency Pressure and Congressional Action on Flag-Burning," Public Choice, Vol. 79 (1993), pp. 79-102.
David T. Ellwood and Thomas J. Kane, "The American Way of Aging: An Event History Analysis," in David A. Wise (ed.), Issues in the Economics of Aging (Chicago: University of Chicago Press, 1990).
Thomas J. Kane, "Giving Back Control: Long-Term Poverty and Motivation," Social Service Review, Vol. 61, No. 3 (1987), pp. 405-419.
Justine S. Hastings, Thomas J. Kane, Douglas O. Staiger and Jeffrey M. Weinstein, "The Effect of Randomized School Admissions on Voter Participation," Journal of Public Economics, Vol. 91 (2007), pp. 915-937.