The objective of the GENIAL study was to explore how university students on full-time undergraduate and postgraduate courses use popular GenAI tools in their learning and assessment.
When ChatGPT and other conversational tools first emerged, much of higher education treated them as a disruption. However, we chose to view this shift as an opportunity to rethink how we teach, and we set out to explore the practical applications of these tools and to understand how their use affects critical thinking skills and learning.
We aimed to fill a knowledge gap and gain insights that could provide valuable direction as we re-imagine higher education for the 21st century in light of these new technologies.
What began as a small focus group in 2023, evaluating code generation tools, quickly became an interdisciplinary, cross-departmental project. Recognising the growing urgency in the field, I helped turn the original initiative into a comprehensive research project that makes an impact beyond LSE.
The best bits:
Watching the project scale from a niche technical evaluation into a broad study of human behaviour was fascinating. It showed that, regardless of whether a student writes code, works on a change management scenario, or designs a policy memo, the fundamental challenge remains the same: how to use GenAI to augment, rather than bypass, the hard work of thinking and learning.
A multidisciplinary research project exploring how university students use generative AI tools in their learning.
Over the 2023–24 academic year, we investigated how GenAI tools affect the learning behaviours of over 200 students across seven different undergraduate and postgraduate courses at LSE, spanning both quantitative and qualitative disciplines.
Participating Departments: the LSE Departments of Statistics and Management, Public Policy, and the Data Science Institute.
Read more below:
We used various data collection methods to gather reliable and high-quality data.
During the first term, we ran a survey at the end of dedicated in-class activities, during which students were asked to work independently and to use the chatbots as an aid.
In the second term, data collection was no longer restricted to in-class chatbot use. We expanded our efforts to include surveys and focus groups, and each week we asked participants to share chat logs related to their learning and participation in the course, both in and out of the classroom. We also collected students’ assignment submissions together with the associated chat logs.
The project found that although GenAI tools can be very helpful learning aids for some students, the growing over-reliance of university students on these tools for learning and assessment risks circumventing rather than enhancing the learning process. The biggest pedagogical challenge is that students may use the tools to replace, rather than strengthen, their learning process and critical skills.
We argue that students may rely on GenAI differently for learning and for assessments, and that they tend to focus more on the output or performance than on the learning journey itself. We also observed that some students use GenAI platforms as a substitute for learning rather than as a tool to enhance learning.
Our findings raise questions about how GenAI can be successfully integrated into the curriculum without jeopardising learning. They led to the development of policy recommendations on curriculum planning and assessment design, so that educators can adapt to these challenges and incorporate GenAI as an aid to learning.
The GENIAL Hub is a cross-disciplinary platform for AI governance best practices across LSE. Through the Hub we organise events, talks, best-practice sharing and network building.
The GENIAL Hub organised the launch of LSE’s partnership with Anthropic, marking a significant step in giving our students access to AI tools built specifically for a university setting.
We brought together student researchers, senior leadership, and guests from Anthropic for a day of working groups and workshops. Our goal was to move beyond the high-level talk of "AI in education" and actually demonstrate how these tools are being used by our students and faculty on the ground.
During the event, our expert panel discussed the future of AI in the classroom, taking questions from both LSE leadership and students on how we can use these technologies without compromising academic rigour.
The best bits:
Having our student ambassadors, students, researchers, university leaders and corporate partners in the same room was refreshing.
The event's inclusive character turned what could have been a standard corporate partnership launch into a genuinely interesting and thought-provoking dialogue.
It was a rare opportunity to bridge the gap between "Big Tech," university leadership, faculty and students. Having a critical and open conversation about the reality of these tools and their impact on learning felt like a real step forward in how we manage these partnerships.
Read more below:
Faculty discussion in the AI Working Group
Evening workshop
Claude for Education demo and dialogue between students, faculty, senior leadership, and our special guests Drew Bent and Greg Feingold.
Special thanks for all their hard work to our student ambassadors Zain Mirza and Allegra Chan Harrison, and to our student researchers Hera Stamatiou, Vivien Kos, Anushka Jain and Juliette Lee.
Read more about the collaboration here.
How GenAI Amplifies Learning Inequalities and What We Can Do About It?
Presentation at the Cambridge Generative AI in Education Conference, 27–29 October 2025
AI in public policy: opportunities and challenges
5 November 2024
Mapping Student-GenAI Interactions onto Experiential Learning: The GENIAL Framework
Under review at a top technology & education journal
Approach Generative AI Tools Proactively or Risk Bypassing the Learning Process in Higher Education
Assessment and curriculum design can’t ignore how students use AI
Times Higher Education
23 July 2024
To improve their courses, educators should respond to how students actually use AI
LSE Impact Blog
22 May 2024