26Gueron & Rolston, Fighting for Reliable Evidence, pp. 302–3.
27Gueron & Rolston, Fighting for Reliable Evidence, p. 306.
28Henry J. Aaron, Politics and the Professors: The Great Society in Perspective, Washington DC: Brookings Institution, 1978, p. 159.
29Don Winstead, quoted in Gueron & Rolston, Fighting for Reliable Evidence, p. 301.
30Gueron & Rolston, Fighting for Reliable Evidence, p. 57.
31Andy Feldman, ‘Fighting for reliable evidence: An interview with Judith Gueron, MDRC, and Howard Rolston, Abt Associates’, Gov Innovator podcast, Episode 32, 10 October 2013.
32Gueron, ‘Fostering research excellence’.
33Feldman, ‘Fighting for reliable evidence’.
5 LEARNING HOW TO TEACH
1Quoted in Shalom M. Fisch & Rosemarie T. Truglio (eds), G is for Growing: Thirty Years of Research on Children and Sesame Street, Routledge, 2014, p. xi.
2This issue is discussed in Melissa S. Kearney & Phillip B. Levine, ‘Early childhood education by MOOC: Lessons from Sesame Street’, NBER Working Paper No. 21229, Cambridge, MA: NBER, 2015.
3Gerry Ann Bogatz & Samuel Ball, The Second Year of Sesame Street: A Continuing Evaluation, Princeton, NJ: Educational Testing Service, 1971.
4Joan Cooney, 2001, ‘Foreword’ in Fisch & Truglio, G is for Growing, pp. xi–xiv. The specific examples of the Sesame Street curriculum are drawn from Rosemarie Truglio, Valeria Lovelace, Ivelisse Seguí & Susan Scheiner, ‘The varied role of formative research: Case studies from 30 years’ in Fisch & Truglio, G is for Growing, pp. 61–82.
5Alison Gopnik, The Philosophical Baby: What Children’s Minds Tell Us About Truth, Love, and the Meaning of Life, New York: Picador, 2010, p. 11.
6This section draws heavily on Emily Hanford (edited by Catherine Winter), ‘Early Lessons’, American RadioWorks, 2009. Transcript available at http://americanradioworks.publicradio.org/features/preschool/
7David P. Weikart, ‘Preliminary results from a longitudinal study of disadvantaged preschool children’, paper presented at the 1967 Convention of the Council for Exceptional Children, St. Louis, Missouri.
8Weikart, ‘Preliminary results’.
9Lawrence J. Schweinhart, Jeanne Montie, Zongping Xiang, et al., Lifetime Effects: The High/Scope Perry Preschool Study Through Age 40, Ypsilanti, MI: High/Scope Press, 2005.
10James J. Heckman, Seong Hyeok Moon, Rodrigo Pinto, et al., ‘The rate of return to the HighScope Perry Preschool Program’, Journal of Public Economics, vol. 94, no. 1, 2010, pp. 114–28 (‘the benefit–cost ratio for the Perry program, accounting for deadweight costs of taxes and assuming a 3% discount rate, ranges from 7 to 12 dollars’).
11In Chicago, a team of economists even set up their own research preschool, the Chicago Heights Early Childhood Center, which operated from 2010 to 2014. The centre randomly assigned young children either to a cognitive stream, which focused on reading, writing and basic numeracy, or to a non-cognitive curriculum, which focused on social skills such as sitting still, executive functioning and expanding working memory. See Steven Levitt, quoted in Stephen J. Dubner, ‘Does “early education” come way too late?’, Freakonomics Radio, 19 November 2015; Roland Fryer, Steven Levitt, John List & Anya Samak, ‘Chicago Heights Early Childhood Center: Early results from a field experiment on the temporal allocation of schooling’, IRP Seminar Presentation, September 2013.
12Frances A. Campbell, Elizabeth P. Pungello, Margaret Burchinal, et al., ‘Adult outcomes as a function of an early childhood educational program: An Abecedarian Project follow-up’, Developmental Psychology, vol. 48, no. 4, 2012, p. 1033; ‘Abecedarian International’, Early Developments: Frank Porter Graham Child Development Institute, vol. 15, no. 1, 2014, pp. 12–15.
13Frances Campbell, Gabriella Conti, James J. Heckman, et al., ‘Early childhood investments substantially boost adult health’, Science, vol. 343, no. 6178, 2014, pp. 1478–85.
14Rae Thomas & Melanie J. Zimmer-Gembeck, ‘Behavioral outcomes of parent-child interaction therapy and Triple P–Positive Parenting Program: A review and meta-analysis’, Journal of Abnormal Child Psychology, vol. 35, no. 3, 2007, pp. 475–95.
15Karen M.T. Turner, Mary Richards & Matthew R. Sanders, ‘Randomised clinical trial of a group parent education programme for Australian Indigenous families’, Journal of Paediatrics and Child Health, vol. 43, no. 6, 2007, pp. 429–37.
16The Incredible Years Basic Parenting program has several versions, which vary between twelve and eighteen sessions. For results of the randomised trial, see Sinead McGilloway, Grainne Ni Mhaille, Tracey Bywater, Mairead Furlong, Yvonne Leckey, Paul Kelly, Catherine Comiskey & Michael Donnelly, ‘A parenting intervention for childhood behavioral problems: a randomized controlled trial in disadvantaged community-based settings’, Journal of Consulting and Clinical Psychology, vol. 80, no. 1, 2012, p. 116. A randomised evaluation of the UK parenting program Sure Start also showed positive results: Judy Hutchings et al., ‘Parenting intervention in Sure Start services for children at risk of developing conduct disorder: Pragmatic randomised controlled trial’, British Medical Journal, vol. 334, no. 7595, 2007, pp. 678–82.
17Donal O’Neill, Sinéad McGilloway, Michael Donnelly, et al., ‘A cost-benefit analysis of early childhood intervention: Evidence from an experimental evaluation of the Incredible Years Parenting Program’, Working Paper n207-10, Maynooth: Department of Economics, Finance and Accounting, National University of Ireland, 2010.
18Two meta-analyses of nurse home visits are Denise Kendrick, Ruth Elkan, Michael Hewitt, et al., ‘Does home visiting improve parenting and the quality of the home environment? A systematic review and meta analysis’, Archives of Disease in Childhood, vol. 82, no. 6, 2000, pp. 443–51; Monica A. Sweet & Mark I. Appelbaum, ‘Is home visiting an effective strategy? A meta-analytic review of home visiting programs for families with young children’, Child Development, vol. 75, no. 5, 2004, pp. 1435–56.
19Megan H. Bair-Merritt, Jacky M. Jennings, Rusan Chen, et al., ‘Reducing maternal intimate partner violence after the birth of a child: A randomized controlled trial of the Hawaii Healthy Start Home Visitation Program’, Archives of Pediatrics & Adolescent Medicine, vol. 164, no. 1, 2010, pp. 16–23; Jamila Mejdoubi, Silvia C.C.M. van den Heijkant, Frank J.M. van Leerdam, et al., ‘Effect of nurse home visits vs. usual care on reducing intimate partner violence in young high-risk pregnant women: a randomized controlled trial’, PLoS ONE, vol. 8, no. 10, 2013, e78185. I am grateful to Cathryn Stephens for bringing this research to my attention.
20See the comparison of randomised and quasi-experimental results on page 1441 of Sweet & Appelbaum, ‘Is home visiting an effective strategy?’
21Dana Suskind, Thirty Million Words: Building a Child’s Brain, New York: Penguin, 2015, p. 52.
22For a critique of the misuse of neuroscience in the development of early childhood programs, see Zoe Williams, ‘Is misused neuroscience defining early years and child protection policy?’, Guardian, 26 April 2014. The ‘1001 Critical Days’ idea is outlined in A. Leadsom, F. Field, P. Burstow & C. Lucas, ‘The 1001 Critical Days: The importance of the conception to age two period’, London, 2013.
23The West Heidelberg experiment is a collaboration between the Children’s Protection Society, Melbourne University Department of Economics and the Murdoch Children’s Research Institute. The study protocol is available at Brigid Jordan, Yi-Ping Tseng, Nichola Coombs, et al., ‘Improving lifetime trajectories for vulnerable young children and families living with significant stress and social disadvantage: the early years education program randomised controlled trial’, BMC Public Health, vol. 14, no. 1, 2014, p. 1. For a comparison of the treatment and control children at the start of the study, see Yi-Ping Tseng, Brigid Jordan, Jeff Borland, et al., Changing the Life Trajectories of Australia’s Most Vulnerable Children, Report No. 1: Participants in the Trial of the Early Years Education Program, Melbourne: University of Melbourne and Children’s Protection Society, 2017.
24This story is told in Alice Hill, Brigid Jordan, Nichola Coombs, et al., ‘Changing life trajectories: The early years education research project’, Insights: Melbourne Business and Economics, vol. 10, 2011, pp. 19–25.
25This section is based on the author’s interviews with educators, researchers and parents at the centre.
26Thomas D. Cook & Monique R. Payne, ‘Objecting to the objections to using random assignment in educational research’, in Frederick Mosteller & Robert Boruch (eds), Evidence Matters: Randomized Trials in Education Research, Washington DC: Brookings Press, 2002, pp. 150–78.
27The Australian Productivity Commission recently recommended that ‘Australia needs to invest, particularly in randomised controlled trials, to build the Australian evidence base on what works best to improve education outcomes.’ Productivity Commission, National Education Evidence Base, Draft Report, Canberra: Productivity Commission, 2016, p. 16.
28Parsing out the effect of schools and families is methodologically tricky. See, for example, James Coleman, Equality of Educational Opportunity, Washington DC: National Center for Educational Statistics, 1966; Eric Hanushek, ‘What matters for student achievement’, Education Next, vol. 16, no. 2, 2016, pp. 19–26; OECD, Learning for Tomorrow’s World – First Results from PISA 2003, Paris: OECD, 2004, pp. 159–205.
29PISA began testing in different subjects at different times, and did not always cover the same set of nations. Comparing average results in the first tested year with results in 2015, mathematics scores have fallen by 8 points in the OECD-30 since 2003; reading scores have fallen by 1 point in the OECD-28 since 2000; and science scores have fallen by 5 points in the OECD-35 since 2006.
30See www.afterschoolalliance.org/policy21stcclc.cfm.
31Neil Naftzger, Seth Kaufman, Jonathan Margolin & Asma Ali, ‘21st Century Community Learning Centers (21st CCLC) Analytic Support for Evaluation and Program Monitoring: An Overview of the 21st CCLC Program: 2004–05’, Report prepared for the U.S. Department of Education, Naperville, IL: Learning Point Associates, 2006.
32Susanne James-Burdumy, Mark Dynarski & John Deke, ‘After-school program effects on behavior: Results from the 21st Century Community Learning Centers program national evaluation’, Economic Inquiry, vol. 46, no. 1, 2008, pp. 13–18.
33For evidence on the impact of after-school programs on academic results, see Susanne James-Burdumy, Mark Dynarski, Mary Moore, et al., ‘When schools stay open late: The national evaluation of the 21st Century Community Learning Centers program: Final report’, US Department of Education, National Center for Education Evaluation and Regional Assistance. Available at www.ed.gov/ies/ncee.
34Quoted in Ron Haskins, ‘With a scope so wide: using evidence to innovate, improve, manage, budget’ in Productivity Commission, Strengthening Evidence-based Policy in the Australian Federation: Roundtable Proceedings, Canberra, 17–18 August 2009, Vol. 1, Canberra: Productivity Commission, 2010, p. 46.
35For each child served, the estimated costs of these programs are: nurse home visits $11,394, high-quality early childhood $10,396, intensive reading support for third graders $3,390, and an evidence-based program to reduce teen pregnancy $763 (all in 2017 dollars). For cost estimates (in 2008 dollars) and the evidence base behind each program, see Julia B. Isaacs, Cost-Effective Investments in Children, Budgeting for National Priorities Project, Washington DC: Brookings Institution, 2007.
36Detailed descriptions of these evaluations may be found at educationendowmentfoundation.org.uk. The respective programs’ names are ‘One-to-One Academic Tuition’, ‘Switch on Reading’, ‘Mathematics Mastery’ and ‘Philosophy for Children’.
37The Education Endowment Foundation’s conversion to months of achievement is based on the assumption that students learn at a rate of one standard deviation per year: S. Higgins, D. Kokotsaki & R. Coe, 2012, ‘The teaching and learning toolkit: Technical appendices’, Education Endowment Foundation, The Sutton Trust. I know of no evidence that British students progress this rapidly. Standard estimates put learning progress at between one-quarter and one-half of a standard deviation per year (see, for example, Andrew Leigh, ‘Estimating teacher effectiveness from two-year changes in students’ test scores’, Economics of Education Review, vol. 29, no. 3, 2010, pp. 480–8). This does not change the relative impact of interventions. However, it does suggest that the EEF’s impacts, when expressed in months of learning, are a lower bound: the true impacts might be two to four times as large.
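To illustrate the conversion, here is a back-of-envelope sketch, assuming for argument’s sake annual progress of one-quarter to one-half of a standard deviation rather than the full standard deviation the EEF toolkit assumes:

```python
# Illustrative only: how a reported 'month of learning' scales if pupils
# actually progress 0.25-0.5 SD per year rather than the 1 SD per year
# assumed by the EEF toolkit.
eef_assumed_sd_per_year = 1.0
plausible_rates = [0.5, 0.25]  # assumed range, drawn from the estimates cited above

for rate in plausible_rates:
    scaling = eef_assumed_sd_per_year / rate
    print(f"At {rate} SD per year, a reported 1-month gain is about {scaling:.0f} true months")
```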
38For Mathematics Mastery, the EEF reports a two-month gain for primary students and a one-month gain for secondary students. They report the cost as £131 per year for primary school pupils and around £50 per year for secondary school pupils. Averaging the cost per month of gain across the two groups, I arrive at a figure of roughly £60 for one month’s improvement.
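A rough reconstruction of that averaging, using only the costs and gains reported in this note:

```python
# Cost per month of learning gained, using the EEF figures quoted above.
primary_cost, primary_gain = 131, 2      # £ per pupil per year; months gained
secondary_cost, secondary_gain = 50, 1

primary_per_month = primary_cost / primary_gain        # ≈ £65.50
secondary_per_month = secondary_cost / secondary_gain  # £50.00
average_per_month = (primary_per_month + secondary_per_month) / 2  # ≈ £58, i.e. roughly £60
print(primary_per_month, secondary_per_month, average_per_month)
```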
39For more details on the ‘Chatterbooks’ evaluation, see www.educationendowmentfoundation.org.uk
40William Earhart, ‘The value of applied music as a school subject’ in Papers and Proceedings of the Music Teachers National Association Forty-First Annual Meeting, Hartford: Music Teachers National Association, 1920, pp. 163–70. Earhart served as national president in 1915–16, at a time when the organisation was known as the Music Supervisors’ National Conference.
41For more details on the ‘Act, Sing, Play’ evaluation, see www.educationendowmentfoundation.org.uk
42From 2002 to 2013, the study identified ninety randomised evaluations, of which eleven (12 per cent) produced positive effects, while seventy-nine (88 per cent) produced weak or no positive effects. Among a subset of seventy-seven well-conducted randomised trials (that is, without problems such as differential attrition or inadequate statistical power), researchers found that seven (9 per cent) produced positive effects, while seventy (91 per cent) produced weak or no positive effects. See Coalition for Evidence-Based Policy, ‘Randomized controlled trials commissioned by the Institute of Education Sciences since 2002: How many found positive versus weak or no effects’, July 2013.
43Robert E. Slavin, ‘Evidence-based reform is irreversible’, Huffpost Education Blog, 22 October 2015.
44The other requirement to get the top rating is that the experiment has a low attrition rate. See What Works Clearinghouse, Procedures and Standards Handbook, Version 3.0, p. 9. Available at http://ies.ed.gov/ncee/wwc/.
45Joseph P. Allen, Robert C. Pianta, Anne Gregory, et al., ‘An interaction-based approach to enhancing secondary school instruction and student achievement’, Science, vol. 333, no. 6045, 2011, pp. 1034–7. The researchers report an impact of 0.22 standard deviations. Since students gain approximately one standard deviation every two years, this impact is equivalent to around half a year of student learning. See also Bill and Melinda Gates Foundation, Seeing it Clearly: Improving Observer Training for Better Feedback and Better Teaching, Washington DC: Gates Foundation, 2015, p. 11.
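The conversion from effect size to years of learning runs roughly as follows (a sketch using the rule of thumb stated in this note):

```python
# Translate the reported effect size into years of student learning,
# using the note's assumption of one standard deviation gained every two years.
effect_sd = 0.22
sd_per_year = 0.5                # one SD per two years
years = effect_sd / sd_per_year  # ≈ 0.44, i.e. about half a school year
print(f"{years:.2f} years of learning")
```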
46Maya Escueta, Vincent Quan, Andre Joshua Nickow & Philip Oreopoulos, ‘Education technology: An evidence-based review’, NBER Working Paper No. 23744, Cambridge, MA: NBER, 2017.
47Escueta et al., ‘Education technology’.
48The share of pupils completing their matriculation exams rose from 18 per cent in control schools to 25 per cent in treatment schools: Joshua Angrist & Victor Lavy, ‘The effects of high stakes high school achievement awards: Evidence from a randomized trial’, American Economic Review, vol. 99, no. 4, 2009, pp. 1384–414.
49Simon Burgess, Raj Chande & Todd Rogers, ‘Texting parents’, Working Paper, Education Endowment Foundation, London, 2016, available at www.educationendowmentfoundation.org.uk
50Todd Rogers & Avi Feller, ‘Intervening through influential third parties: Reducing student absences at scale’, working paper, Cambridge, MA: Harvard University Kennedy School, 2017.
51Paul Tough, 2008, Whatever It Takes: Geoffrey Canada’s Quest to Change Harlem and America, New York: Houghton Mifflin, pp. 21–9.
52Will Dobbie & Roland G. Fryer Jr., ‘Are high-quality schools enough to increase achievement among the poor? Evidence from the Harlem Children’s Zone’, American Economic Journal: Applied Economics, vol. 3, no. 3, 2011, pp. 158–87; Will Dobbie & Roland G. Fryer Jr., ‘The medium-term impacts of high-achieving charter schools’, Journal of Political Economy, vol. 123, no. 5, 2015, pp. 985–1037.
53Quoted in David Brooks, ‘The Harlem Miracle’, New York Times, 7 May 2009, p. A31.
54Betty Hart & Todd Risley, Meaningful Differences in the Everyday Experience of Young American Children, Baltimore, MD: Paul H. Brookes, 1995. Among the limitations of the study was that it focused on only 42 families, each observed for an hour per month over a 30-month period. The 30 million word estimate assumes that the children in the sample were representative of their respective socio-economic groups, and that the observed word counts can be linearly extrapolated. Although it did not garner the same headlines, an equally interesting finding of the study was that advantaged children receive 6 encouragements to 1 discouragement, while disadvantaged children receive 1 encouragement to 2 discouragements.
55The initiative reports: ‘We’ve recently completed a randomized control trial of the TMW curriculum with parents from Chicago’s South Side. The treatment group received education during eight weekly one-hour home visits. The control group received a nutrition intervention during eight weekly five- to ten-minute home visits. All participants completed fourteen LENA recordings, although only the treatment group received quantitative linguistic feedback. Participants receiving the TMW intervention significantly increased their talk and interaction with their children. Publication of this study is forthcoming.’: http://thirtymillionwords.org/tmw-initiative/.
56Steven D. Levitt, John A. List, Susanne Neckermann & Sally Sadoff, ‘The behavioralist goes to school: Leveraging behavioral economics to improve educational performance’, American Economic Journal: Economic Policy, vol. 8, no. 4, 2016, pp. 183–219.