The Latest Takes on Generative AI in Education

May 10, 2024 | By Rebecca Griffiths & Kerry Friedman

What is generative AI?
Generative AI is a class of artificial intelligence algorithms and models that can generate new content (such as images, audio, and text) based on patterns, structures, and styles it has learned from an existing dataset. It is the next step in AI beyond traditional machine learning, which largely serves to observe and classify content based on predictive models.
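To make the distinction concrete, here is a deliberately tiny, illustrative sketch (not any production system): a keyword-based classifier stands in for the "observe and classify" predictive style, while a first-order Markov chain stands in for the "learn patterns, then produce new content" generative style. All names and the toy corpus are our own invention for illustration.

```python
import random

# Toy "predictive" model: labels existing content without creating anything new.
def classify(text):
    return "question" if text.strip().endswith("?") else "statement"

# Toy "generative" model: learns word-to-word patterns from a corpus...
def train_markov(corpus):
    words = corpus.split()
    model = {}
    for a, b in zip(words, words[1:]):
        model.setdefault(a, []).append(b)
    return model

# ...then produces new text by sampling from those learned patterns.
def generate(model, start, length=8, seed=0):
    rng = random.Random(seed)  # fixed seed so runs are reproducible
    out = [start]
    for _ in range(length - 1):
        followers = model.get(out[-1])
        if not followers:
            break
        out.append(rng.choice(followers))
    return " ".join(out)

corpus = "the tutor asks the student a question and the student answers the question"
model = train_markov(corpus)
print(classify("Can AI tutor students?"))  # predictive: assigns a label
print(generate(model, "the"))              # generative: emits new text
```

Real generative models replace the Markov table with neural networks trained on vastly larger datasets, but the conceptual split is the same: one family assigns labels to content that already exists, the other produces content that did not.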

Generative AI is taking center stage at many education conferences this year. Educators, administrators, developers, investors, policymakers, and researchers across the land are trying to figure out what these technologies mean for the future. We attended a series of large conferences and smaller events in 2024, including the ASU-GSV Summit, OLC Innovate, and the Knowledge Alliance Forum1, and have distilled some observations that may be helpful for those unsure what to make of this latest tsunami.

Our sense is that, amidst the frenzy of investor and entrepreneurial enthusiasm, attitudes towards generative AI among practitioners and researchers run closer to cautious curiosity. We recognize that generative AI is poised to have dramatic impacts across many fields of human endeavor, but the potential harms seem to be crystallizing more quickly than transformative benefits in education.

There are two categories of benefits that are easiest to conceptualize at this early stage. The first is that the long-heralded potential of technology-enabled personalized learning may finally come to fruition. Generative AI systems that can interpret and produce natural language, both oral and written, could bring us closer to replicating the “two sigma” benefits of an individual human tutor famously documented in Benjamin Bloom’s 1984 study. Bloom found that the average student tutored one-to-one performed two standard deviations better than students educated in a classroom. AI tutors hold the potential to reproduce this impact at scale through one-to-one support customized for individual students’ interests, contexts, and learning abilities.

The second is process improvement, which sounds banal but could be quite significant if the automation of rote tasks frees up time and resources for perhaps our most valuable process in education: teachers engaging in meaningful interactions with students. AI tools have the potential to relieve some of the administrative burdens associated with teaching, such as grading, designing assessments, and creating slides or study materials. Products like Kyron Learning and AI Tutor Pro can generate lesson plans and PowerPoint slides that teachers can then refine, demonstrating how AI tools can make teachers’ jobs easier without replacing them. Other applications could streamline back-office processes.


Educational agencies and institutions are, appropriately, moving cautiously, and most implementations involve more “vanilla” forms of AI such as predictive analytics. K-12 observers report pockets of innovation, but, for the most part, uses of generative AI are not systemic or addressing organizational processes. Most policy conversations are happening at the state level, but policymakers are overwhelmed by questions surrounding generative AI, on top of all the other issues they face. (To address this challenge, TeachAI.org recently released a policy-related toolbox for different levels of policymakers.)

The postsecondary sector is somewhat further along in adopting AI technologies, according to presenters from Deloitte Consulting’s higher education practice. Some colleges and universities are already using AI for backend systems and are most interested in using generative AI to “supercharge people” rather than replace them. Many institutions are focused on establishing organizational foundations for AI such as governance and processes for reviewing use cases and outcomes. Some institutions have created policies for academic integrity, but classroom use is otherwise largely up to individual faculty members. Administrators are wary of using generative AI with external audiences such as prospective students and donors because the behavior of AI models is still unpredictable.

Speakers at ASU-GSV emphasized that educators are already overwhelmed by the glut of ed-tech products and find that many overpromise and underdeliver. There is a risk of further alienating educators with a surge of generative AI products that are not grounded in learning science or evidence. The asymmetric incentives for start-up ed-tech companies—for whom the upside gains of rapid growth outweigh the downside risks of poor performance or safety issues—could escalate in the current gold rush climate. Another reason for educators to proceed with caution!

Most importantly, we should not forget what we have learned about educational technologies over the past few decades. Throwing technology over the wall into schools or developing products around the affordances of technology has rarely proven effective. To benefit students, ed tech needs to address a problem of practice and be integrated into a coherent instructional system (e.g., Elmore, 2004). The ways in which students learn have not changed. Generative AI cannot create synaptic connections in students’ brains, nor does it alleviate trauma, discrimination, food and housing insecurity, or other barriers that students from systemically marginalized backgrounds continue to face.

On the other hand, generative AI introduces some new dangers. Across conferences and events, many speakers recognized the risk that misuse of generative AI tools could undermine students' development of critical thinking and personal expression skills. The potential for reproducing biases was a major concern, along with trustworthiness and risks to privacy.

What are some solutions? Recent developments with use of synthetic data (i.e., datasets generated by AI models themselves) to train models could be a way to prevent biases and avoid infringing intellectual property rights. Purposeful curation of training datasets is another promising pathway; we learned of a project where students documented the history of their small town to feed into AI models, to make sure that their culture and history were preserved. Keeping humans in the loop may help to detect and remediate biased algorithms. Additionally, the approach of "designing to the margins" can result in tools that prioritize the needs of marginalized communities and are simply better for everyone. These sorts of intentional efforts to shape the technologies of the future will be essential to ensure we do not continue to perpetuate injustices and blind spots of the past.

A common theme across K-12 and postsecondary education is that students are ahead of the adults. One panelist at ASU-GSV mentioned the idea of cross-generational alliances—the need to work alongside younger generations, including current students involved in development initiatives and policy formation related to AI. We like this approach and believe that educators should also take an active role in shaping the ways in which students use AI. Educators and students are uniquely positioned to help design AI in ways that augment the teaching and learning experience, reduce barriers, and enhance opportunities for students to build critical thinking and self-expression capacities.

At one event, a speaker from Google posited that we are at the mid-point of exponential growth in AI understanding. Within one to two years, he said, we will have a much different perspective on the power and limits of AI. For now, we at SRI will seek opportunities to build AI literacy within education and our understanding of how to leverage generative AI affordances in service of evidence-based education.


1 The Knowledge Alliance Forum is a semiannual gathering for a diverse community of researchers, policymakers, thought leaders, and funders to share insights, collaborate, develop professionally, and foster positive change in the education sector.

Tags: Education technology, Educators and systems leaders, Evidence-based, Policymakers, Research & Developers