
Dr Amir Ghalamzan
About
Biography
Amir is an Associate Professor in Computer Science at the University of Surrey, where he leads research in Intelligent Manipulation. His primary areas of focus are robot learning, robotic grasping and manipulation, teleoperation, agri-food robotics, haptics, and tactile sensing.
Prior to his role at the University of Surrey, Amir served as an Associate Professor at the Lincoln Institute for Agri-Food Technology from 2021 to 2023 and as a Senior Lecturer in the School of Computer Science from 2018 to 2021, both at the University of Lincoln. Before that, he was a Senior Research Fellow at the University of Birmingham from 2015 to 2018.
Amir earned his Ph.D. in Robot Learning from Demonstration at Politecnico di Milano in 2015. He also holds a Second Level Specialization in Automatic Control Engineering from Politecnico di Torino (2011) and an M.Sc. in Mechanical Engineering-Systems, Vibration, and Control from Iran University of Science and Technology (2009).
His research interests lie in fundamental aspects of robot learning and data-driven control and planning, with a strong commitment to addressing real-world challenges. Amir has particularly focused on mitigating labour shortages in the agri-food sector by developing robots capable of functioning under extreme conditions, such as strawberry picking in high temperatures and humidity.
News
In the media
Research
Past Research Projects
2020 - 2021 funded by Cambridge Enterprise CERES Agtech Robofruit: £310K (PI) As the sole investigator, I led the development of a ground-breaking, fully autonomous strawberry-picking robot (SPR) that meets all food safety and standards requirements. The SPR incorporates state-of-the-art perception, AI, motion planning, and control components, specifically designed and integrated for optimal field performance. This work resulted in a patent-protected invention with the potential to revolutionise the agricultural industry. To validate the SPR's capabilities, we conducted extensive testing in UoL's strawberry polytunnel and Dyson Glasshouses, ensuring its readiness for real-world applications.
2021 - 2022 funded by IUK Fastpick: £150K (PI) As head of the UoL team, I played a vital role in developing a revolutionary perception and computing system for strawberry picking. The system was the first of its kind, using a private 5G network and an edge server for computationally demanding processes. Collaborating with Saga Robotics and Robotfruit, we conducted real-time perception testing to validate its capabilities.
2021 - 2022 funded by IUK Bi-SENSS: £300K (PI) As the lead at UoL, I spearheaded the development of a dynamic and kinematic robot model, a grasping method using an RGB-D sensor, and motion planning and control for transforming DEXTER from a teleoperated system to an autonomous system. Our team worked closely with Veolia, the project coordinator, as well as Faculty.ai and other partners to ensure seamless integration of the high-level decision-making AI into the autonomous robotic system.
2020 - 2021 funded by Cancer Research UK ARTEMIS: £100K (PI) Under my leadership at UoL, we developed a pioneering learning-from-demonstration technique that allows a robot to learn the complex task of palpating a breast phantom directly from a human demonstration, without relying on any hand-designed features or computer vision techniques. Our approach directly maps visual information to robot action plans, resulting in highly efficient and accurate performance. Collaborating with colleagues at Imperial College London and the University of Bristol, we demonstrated the potential for using robotic systems for early breast cancer detection and screening.
2021 - 2021 funded by EPSRC via NCNR CELLO: £100K (PI) Haptic-guided mobile manipulation (CELLO): my team developed a novel haptic-guided method for mobile manipulation. This makes the teleoperation of autonomous cars and mobile manipulators much easier, as control of the system is shared between the AI and the human operator.
2023 - 2027 (Co-I) funded by IUK Agri Open-Core: £1,500K In this project, we aim to develop open-source software for robotic selective harvesting. I contribute by integrating motion planning into several harvesting robots.
2017 - 2022 (Co-I) funded by EPSRC NCNR: £2,000K National Centre for Nuclear Robotics (NCNR): As one of the project supervisors, I oversaw a team of Ph.D. and postdoctoral researchers and developed novel haptic-guided teleoperation systems and data-driven control methods for physical robot interaction. Our goal was to enable robust grasping and manipulation of objects in high-consequence environments.
2019 - 2022 (Co-I) funded by IUK Grasp-berry: £300K As the supervisor, I led the team in the development of an innovative interactive motion planning method designed for handling complex tasks, including strawberry cluster manipulation during picking. We verified the efficacy of our approach using simulation techniques.
2020 - 2020 (Co-I) funded by IUK SBRI Competition Sort and Seg: £30K
2022 - 2022 (Co-I) funded by UKSA Space Debris removal: £30K
Publications
Additional publications
- Modular autonomous strawberry picking robotic system, S Parsa, B Debnath, MA Khan, A Ghalamzan, Journal of Field Robotics, 2023.
- Towards autonomous selective harvesting: A review of robot perception, robot design, motion planning and control, V Rajendran, B Debnath, S Mghames, W Mandil, S Parsa, S Parsons, A Ghalamzan, Journal of Field Robotics, 2023.
- Deep Functional Predictive Control for Strawberry Cluster Manipulation using Tactile Prediction, K Nazari, G Gandolfi, Z Talebpour, V Rajendran, P Rocco, A Ghalamzan, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2023.
- Proactive slip control by learned slip model and trajectory adaptation, K Nazari, W Mandil, A Ghalamzan, Conference on Robot Learning (CoRL), 2022.
- Deep Movement Primitives: toward Breast Cancer Examination Robot, O Sanni, G Bonvicini, MA Khan, PC López-Custodio, K Nazari, A Ghalamzan, AAAI Conference on Artificial Intelligence 36 (11), 12126-12134, 2022.
- Action Conditioned Tactile Prediction: a case study on slip prediction, W Mandil, K Nazari, A Ghalamzan, Robotics: Science and Systems (RSS), 2022.
- Planning maximum-manipulability cutting paths, T Pardi, V Ortenzi, C Fairbairn, T Pipe, A Ghalamzan, R Stolkin, IEEE Robotics and Automation Letters (RA-L) 5 (2), 1999-2006, 2020.
- Haptic-guided shared control for needle grasping optimization in minimally invasive robotic surgery, M Selvaggio, A Ghalamzan, R Moccia, F Ficuciello, B Siciliano, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2019.
- Robot learning from demonstrations: Emulation learning in environments with moving obstacles, A Ghalamzan, M Ragaglia, Robotics and Autonomous Systems (RAS), 2018.
- Guiding Trajectory Optimization by Demonstrated Distributions, T Osa, A Ghalamzan, R Stolkin, R Lioutikov, J Peters, G Neumann, IEEE Robotics and Automation Letters (RA-L), 2017.
- An incremental approach to learning generalizable robot tasks from human demonstration, A Ghalamzan, C Paxton, GD Hager, L Bascetta, IEEE International Conference on Robotics and Automation (ICRA), 5616-5621, 2015.