AI could undermine meaningful learning unless feedback stays rooted in connection, researchers warn
The rise of generative AI in higher education is reshaping how feedback is delivered, but meaningful learning could be undermined if its use is not carefully guided by principles of care, trust and connection, according to new research led by the University of Surrey.
Published in Assessment & Evaluation in Higher Education, the paper explores how generative AI technologies, including chatbots such as ChatGPT, are transforming feedback for students – highlighting both the opportunities and risks of AI in education. While AI can generate responses at speed and scale, researchers argue it cannot fully replicate the judgement and relationships that make feedback effective.
Instead, they call for a “care-full” approach – one that treats feedback not as a set of comments, but as an ongoing process of dialogue, reflection and growth. Without this, they warn, AI risks reducing feedback to a transactional exercise rather than a meaningful part of learning.
The paper also finds that while AI-generated feedback can be useful, students tend to place greater trust in feedback from human educators. Human feedback is more likely to be acted upon because it reflects a deeper understanding, empathy and context – qualities AI cannot replicate.
At the same time, the research team note that AI can play a valuable complementary role. For some students, it offers a low-pressure way to explore ideas or seek feedback with reduced fear of judgement. However, over-reliance on AI may reduce meaningful interaction between students and educators and could even exacerbate existing educational inequalities if some students benefit more than others.
The research builds on an international manifesto developed by the team, which sets out ten principles for feedback in the age of AI. These are:
- Feedback is a process, not corrective comments
- Feedback is a relational practice
- Feedback can be messy, uncomfortable, challenging and joyous
- Feedback should be an ethical practice
- Feedback should promote learning over time
- Feedback and associated technologies should be designed in conversation with learners and educators
- Feedback engagement requires time and care
- Learning, not technological efficiency or compliance, should drive thinking and decision-making regarding feedback processes
- Feedback can be enhanced by digital technologies, but digital technologies do not always enhance feedback
- More feedback is not necessarily better for learning
To ensure meaningful learning, the authors suggest generative AI should be integrated thoughtfully into education, alongside a continued commitment to “care-full” feedback practices that are continually evaluated and evolved, while upholding equity, professional expertise and connection.
###
Notes to editors
- Professor Naomi Winstone is available for interview; please contact mediarelations@surrey.ac.uk to arrange.
- The full paper can be found here: https://www.tandfonline.com/doi/ref/10.1080/02602938.2026.2643333?scroll=top
Media Contacts
External Communications and PR team
Phone: +44 (0)1483 684380 / 688914 / 684378
Email: mediarelations@surrey.ac.uk
Out of hours: +44 (0)7773 479911