In-school supervised EdTech support produces massive learning gains: A Khan Academy field experiment in India


A fundamental challenge in education is that students are different. The average fifth-grade class contains students working at levels ranging from third grade to eighth grade, and this achievement gap only widens over time (Peters et al. 2017, Cascio and Staiger 2012, Nielsen 2023). The problem is particularly acute in developing countries. Hanushek et al. (2022) document that at least two-thirds of the world’s youth fail to reach basic skill levels. India faces especially severe deficits, with eighth graders performing on average four grades below their enrolled level (Muralidharan and Singh 2025).

Tutoring offers a natural solution by allowing students to progress at their own pace with individualised feedback (Beck 2007). A meta-analysis finds that tutoring generates learning gains of 0.36 standard deviations (Dietrichson et al. 2017). But effective tutoring programmes cost thousands of dollars per student annually, making them difficult to scale (Oreopoulos et al. 2024, Strassberger and Condliffe 2024, White et al. 2023). The pandemic highlighted these constraints as districts struggled to deliver tutoring despite unprecedented funding (Fahle et al. 2024, Guryan and Ludwig 2023).

The promise and puzzle of computer-assisted learning

Computer-assisted learning has attracted attention precisely because it promises tutoring-like personalisation at far lower cost. Research on ‘teaching at the right level’ demonstrates that targeting instruction to students’ actual learning levels generates substantial gains (Banerjee et al. 2007, 2016, Duflo et al. 2011), and computer-assisted learning platforms can deliver this personalisation through adaptive algorithms. A meta-analysis found median effect sizes of 0.29 standard deviations for educational technology in developing countries (Rodriguez-Segura 2022), suggesting real potential for impact at scale.

Yet computer-assisted learning produces dramatically different outcomes depending on context. Some studies find negative or null effects (Morgan and Ritter 2002, Pane et al. 2010, 2014), while others find positive effects (Barrow et al. 2009, Roschelle et al. 2016, Copeland et al. 2023), with enormous variation even within the same study across different classrooms (Oreopoulos et al. 2024). The key appears to lie in implementation quality. As Hill and Erickson (2021) conclude, “low fidelity increases the likelihood of weak student outcomes” while “moderate fidelity may be enough to yield positive programme outcomes”.

Evidence from India reinforces this pattern. Muralidharan and Singh (2025) found that the personalised learning software Mindspark achieved gains of 0.22 standard deviations over 18 months when implemented with dedicated support structures, but when that support decreased, usage dropped by half. This suggests that the binding constraint may not be the quality of the technology but the organisational capacity to ensure consistent, productive use.

Testing the implementation hypothesis

We tested this hypothesis in Uttar Pradesh (Oreopoulos et al. 2026). In 2022, Khan Academy partnered with 105 government boarding schools to launch a mathematics improvement programme. The partnership followed conventional approaches that previous research suggested would work: teacher training sessions, WhatsApp support channels, technical helplines, and monthly performance monitoring. Schools were encouraged to dedicate sessions to the platform, with targets of 120 minutes of monthly practice.

The results fell short of expectations. Only 44% of registered students used the platform even once during the entire year. Teachers facing competing demands found it difficult to allocate time for Khan Academy consistently, with practice sessions occurring once monthly or less rather than the intended weekly sessions. This experience suggested a different approach: if conventional support methods were insufficient, perhaps what was needed was dedicated personnel whose sole responsibility was ensuring implementation.

We designed a randomised controlled trial to test this directly. We randomly assigned 83 schools to treatment or control conditions over 31 weeks, covering 5,535 students in grades 6–8. Treatment schools received dedicated lab-in-charges whose full-time job was ensuring high-fidelity implementation. These staff guaranteed two Khan Academy sessions weekly, formally integrating them into school timetables. They trained students in basic digital literacy, monitored engagement, troubleshot connectivity issues, and supported motivational campaigns. Control schools retained full Khan Academy access and initial training but lacked this dedicated implementation staff.
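
To make the design concrete, here is a minimal sketch of school-level (cluster) random assignment in Python. It is purely illustrative: the seed, the school labels, and the 42/41 split are assumptions, and the actual study may have used stratified randomisation.

    import random

    # Hypothetical cluster randomisation of 83 schools (treatment vs control).
    # Assignment happens at the school level, so all students in a school
    # share the same condition, as in the experiment described above.
    rng = random.Random(2024)  # illustrative fixed seed for reproducibility
    schools = [f"school_{i:02d}" for i in range(1, 84)]  # 83 hypothetical labels
    rng.shuffle(schools)
    treated = set(schools[:42])  # roughly half to treatment (42 vs 41)
    assignment = {s: ("treatment" if s in treated else "control") for s in schools}

Because randomisation is at the school level, outcome analyses would also cluster standard errors at the school level.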

From implementation to learning gains

The contrast was striking. Figure 1 shows treatment students practised 47.4 minutes per week compared to just 7.2 minutes in control schools, a 6.6-fold increase sustained throughout the seven-month intervention. Practice time rose quickly to 50–60 minutes per week and remained elevated despite a temporary holiday dip, while control usage stayed consistently low at 5–15 minutes per week.

Figure 1 Sustained engagement gap between treatment and control schools

Notes: Weekly average platform usage in treatment (solid line) versus control (dashed line) schools over the 31-week intervention period (August 2024 to February 2025).
Source: Oreopoulos et al. (2026).

This sustained engagement translated into productive learning. Treatment students mastered nearly one additional skill per hour of practice compared to control students, showing that lab-in-charges ensured quality engagement rather than mere time on task. Students in treatment schools scored 0.44 to 0.47 standard deviations higher on independently administered mathematics assessments, representing a move from the 50th to approximately 67th percentile. This is equivalent to two to three years of typical schooling in low- and middle-income countries (Evans and Yuan 2019), with gains remarkably uniform across question difficulty levels and student subgroups. Figure 2 illustrates this comprehensive impact, showing that treatment and control groups began with nearly identical baseline score distributions but diverged substantially by the intervention’s end.

Figure 2 Treatment effect on the distribution of mathematics achievement

Notes: Baseline scores (left panel) show that treatment and control groups were balanced before the intervention. Endline scores (right panel) show that the treatment distribution shifted substantially rightward after 31 weeks.
Source: Oreopoulos et al. (2026).
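
The percentile move reported above follows from the usual normality assumption: a student at the control-group median (the 50th percentile) who gains 0.44–0.47 standard deviations lands at Φ(0.44) to Φ(0.47) of the control distribution. A quick standard-library check, involving no study data:

    from math import erf, sqrt

    # Percentile implied by a d-standard-deviation gain for a median student,
    # assuming approximately normal test-score distributions.
    def norm_cdf(z):
        return 0.5 * (1 + erf(z / sqrt(2)))

    for d in (0.44, 0.47):
        print(f"{d} SD gain -> {norm_cdf(d):.1%} of control distribution")
    # 0.44 SD gain -> 67.0% of control distribution
    # 0.47 SD gain -> 68.1% of control distribution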

These effect sizes exceed those of comparable interventions while demonstrating superior cost-effectiveness. Compared to Mindspark (Muralidharan and Singh 2025), we achieved double the effect in less than half the time, and we exceeded the gains from high-dosage tutoring while spending only $24 per student annually, compared to thousands of dollars for tutoring.
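
To put that comparison in a common unit, a standard metric is standard deviations of learning gained per $100 spent per student. The sketch below uses the study’s $24 cost and 0.44 SD lower-bound effect; the $2,000 tutoring cost is an assumption (the column says only ‘thousands of dollars’), paired with the 0.36 SD meta-analytic tutoring gain (Dietrichson et al. 2017):

    # Back-of-the-envelope cost-effectiveness: SD of learning per $100 per student.
    # The $2,000 tutoring cost is an ASSUMED illustrative figure, not a study number.
    programmes = {
        "Khan Academy + lab-in-charge": (0.44, 24),
        "High-dosage tutoring (assumed cost)": (0.36, 2000),
    }
    for name, (effect_sd, cost_usd) in programmes.items():
        print(f"{name}: {100 * effect_sd / cost_usd:.3f} SD per $100")
    # Khan Academy + lab-in-charge: 1.833 SD per $100
    # High-dosage tutoring (assumed cost): 0.018 SD per $100

Even if the tutoring cost assumption were off by a factor of two in either direction, the roughly hundred-fold gap in cost-effectiveness would remain large.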

Implementation capacity as a critical complement to the technology

What explains this success? The answer points to effective implementation as the missing complement to the technology. Recent research by Di Liberto et al. (2025) shows that management quality in schools significantly affects student outcomes, and our results demonstrate this principle in the EdTech context. Dedicated personnel whose sole responsibility was ensuring implementation fidelity made the difference through protected curriculum time, active troubleshooting, quality monitoring, and motivational systems that led to higher-quality practice time and effective integration of Khan Academy with the classroom curriculum.

The generalisable lesson extends beyond our specific staffing model. Schools might reallocate existing teachers, hire part-time personnel, or deploy paraprofessionals. The key is that someone must be responsible and accountable for implementation, transforming computer-assisted learning from an optional supplement that gets perpetually deferred to mandatory curriculum time with dedicated support.

This insight matters now as governments invest in educational recovery following pandemic-related learning loss. The question is not simply which technologies to adopt but how to build organisational structures that guarantee implementation fidelity. Looking forward, emerging AI-powered tutoring systems may offer additional pedagogical benefits, but similar implementation structures will likely be needed regardless of technological sophistication. The challenge ahead is building computer-assisted learning programmes with the organisational capacity to translate platform access into sustained, productive learning.

References

Banerjee, A, R Banerji, J Berry, E Duflo, H Kannan, S Mukherji, M Shotland, and M Walton (2016), “Mainstreaming an effective intervention: Evidence from randomized evaluations of ‘teaching at the right level’ in India”, NBER Working Paper 22746.

Banerjee, A V, S Cole, E Duflo, and L Linden (2007), “Remedying education: Evidence from two randomized experiments in India”, Quarterly Journal of Economics 122(3): 1235–64.

Barrow, L, L Markman, and C E Rouse (2009), “Technology’s edge: The educational benefits of computer-aided instruction”, American Economic Journal: Economic Policy 1(1): 52–74.

Beck, R J (2007), “Towards a pedagogy of the Oxford tutorial”, Lawrence University.

Cascio, E, and D Staiger (2012), “Knowledge, tests, and fadeout in education interventions”, NBER Working Paper 18038.

Copeland, S, M A Cook, A A Grant, and S M Ross (2023), “Randomized-control efficacy study of IXL math in Holland public schools”, Johns Hopkins Center for Research and Reform in Education, Baltimore, MD.

Di Liberto, A, L Giua, F Schivardi, M Sideri, and G Sulis (2025), “Leading schools, raising scores: The role of management in education”, VoxEU.org, 5 March.

Dietrichson, J, M Bøg, T Filges, and A-M K Jørgensen (2017), “Academic interventions for elementary and middle school students with low socioeconomic status: A systematic review and meta-analysis”, Review of Educational Research 87(2): 243–82.

Duflo, E, P Dupas, and M Kremer (2011), “Peer effects, teacher incentives, and the impact of tracking: Evidence from a randomized evaluation in Kenya”, American Economic Review 101(5): 1739–74.

Evans, D, and F Yuan (2019), “Equivalent years of schooling: A metric to communicate learning gains in concrete terms”, World Bank Group Policy Research Working Paper WPS 8752.

Fahle, E, T J Kane, S F Reardon, and D Staiger (2024), “The first year of pandemic recovery: A district-level analysis”, Center for Education Policy Research, Harvard University.

Guryan, J, and J Ludwig (2023), “Overcoming pandemic-induced learning loss”, in M S Kearney, J Schardin, and L Pardue (eds.), Building a More Resilient US Economy, Washington, DC: Aspen Institute.

Hanushek, E, L Woessmann, and S Gust (2022), “A world unprepared: Missing skills for development”, VoxEU.org, 5 December.

Hill, H C, and A Erickson (2021), “Using implementation fidelity to aid in interpreting programme impacts: A brief review”, Annenberg Institute at Brown University, EdWorkingPaper 21-414.

Morgan, P, and S Ritter (2002), “An experimental study of the effects of cognitive tutor algebra I on student knowledge and attitude”, Carnegie Learning, Pittsburgh, PA.

Muralidharan, K, and A Singh (2025), “Adapting for scale: Experimental evidence on computer-aided instruction in India”, working paper.

Nielsen, E (2023), “The variance of achievement increases during childhood”, Federal Reserve Board of Governors Working Paper.

Oreopoulos, P, C Gibbs, M Jensen, and J Price (2024), “Teaching teachers to use computer assisted learning effectively: Experimental and quasi-experimental evidence”, NBER Working Paper 32388.

Oreopoulos, P, O Keyes-Krysakowski, and D Agarwal (2026), “How in-school supervised ed-tech support produces massive learning gains: A Khan Academy field experiment in India”, NBER Working Paper 34683.

Pane, J F, B A Griffin, D F McCaffrey, and R Karam (2014), “Effectiveness of cognitive tutor algebra I at scale”, Educational Evaluation and Policy Analysis 36(2): 127–44.

Pane, J F, D F McCaffrey, M E Slaughter, J L Steele, and G S Ikemoto (2010), “An experiment to evaluate the efficacy of cognitive tutor geometry”, Journal of Research on Educational Effectiveness 3: 254–81.

Peters, S J, K Rambo-Hernandez, M C Makel, M S Matthews, and J A Plucker (2017), “Should millions of students take a gap year? Large numbers of students start the school year above grade level”, Gifted Child Quarterly 61(3): 229–38.

Rodriguez-Segura, D (2022), “EdTech in developing countries: A review of the evidence”, The World Bank Research Observer 37(2): 171–203.

Roschelle, J, M Feng, R F Murphy, and C A Mason (2016), “Online mathematics homework increases student achievement”, AERA Open 2(4).

Strassberger, M, and B Condliffe (2024), “How to build it and ensure they will come: Educators’ advice on high-dosage tutoring programs”, MDRC, New York.

White, S, L Groom-Thomas, and S Loeb (2023), “A systematic review of research on tutoring implementation: Considerations when undertaking complex instructional supports for students”, Annenberg Institute at Brown University, EdWorkingPaper 22-652.


