AWARE: Adaptive World Models for Robust Robot Learning and Deployment
Scaling robot intelligence through physically grounded adaptive world models.
Start date
1 October 2026
Duration
42 months
Application deadline
Funding information
- Home fees: £5,238
- UKRI stipend: £21,805
- Research Training Support Grant (RTSG): £4,000 total.
About
The project
The next frontier of intelligent robotics will be driven by effective and efficient internal models of how the world evolves. This PhD will develop physically grounded world models to enable data-efficient and more generalisable robot learning while progressively aligning learning with real-world behaviour and accelerating deployment.
The goal: establish physically grounded world models as the foundation for scalable and deployable robot learning. You will work at the frontier of:
- Generative world models for embodied AI
- Physically grounded world models as adaptive training environments
- Imitation learning/reinforcement learning
- Spatial, geometric, and physical reasoning for robotics
- Robust policy generalisation across tasks and environments.
This is an opportunity to help redefine how robots learn, reason, and are deployed safely at scale.
Academic environment
This studentship is based at the Centre for Vision, Speech and Signal Processing (CVSSP) at the University of Surrey.
CVSSP is:
- The largest UK research centre in its field
- Ranked 1st in the UK for Computer Vision, and 3rd for AI, Computer Vision, and Robotics combined (csrankings.org)
- Home to world-leading research in computer vision, multimodal AI, generative modelling, and robotics.
You will join a vibrant and collaborative research group working at the intersection of:
- Generative world models
- Vision–Language–Action systems
- 3D scene modelling and simulation
- Real-world robotic learning and deployment.
The team provides strong mentoring, close collaboration, and access to high-performance compute infrastructure and real robotic platforms.
Why apply?
This PhD offers the opportunity to:
- Work at the forefront of generative AI and embodied robot learning
- Develop next-generation world models that directly impact real-world deployment
- Publish in leading robotics and AI venues (e.g., ICRA, RSS, CoRL, NeurIPS)
- Collaborate within one of the UK’s strongest AI and robotics research centres
- Build both academic excellence and real-world impact.
You will help shape how robots learn, reason, and safely operate in complex real-world environments.
Eligibility criteria
Applicants should hold (or expect to obtain) a minimum of a UK 2:1 honours degree (or international equivalent) in Robotics, Computer Science, Artificial Intelligence, Electronic Engineering, Mathematics, or a closely related discipline. A master's degree at Merit level or above (or international equivalent) in a relevant subject is desirable.
We are particularly interested in candidates with knowledge or experience in one or more of the following areas:
- Machine learning and deep learning
- Reinforcement learning or imitation learning
- Robot learning, control, or embodied AI
- Computer vision or 3D spatial representations
- Physics-based simulation or modelling.
Candidates should demonstrate:
- Strong programming skills (e.g., Python; experience with PyTorch or similar frameworks is desirable)
- Good mathematical foundations (linear algebra, probability, optimisation)
- Motivation for independent research and problem-solving
- Strong written and verbal communication skills.
Prior experience with robotic systems, simulation platforms, or large-scale model training would be advantageous but is not essential.
This studentship is open to candidates who pay UK/home-rate fees. See UKCISA for further information.
How to apply
Applications should be submitted via the Vision, Speech and Signal Processing PhD programme page. In place of a research proposal, you should upload a document stating the title of the project that you wish to apply for and the name of the relevant supervisor.
Studentship FAQs
Read our studentship FAQs to find out more about applying and funding.
Contact details
Peng Wang