AI and Robotics (STAR Lab)

We are the Surrey Technology for Autonomous systems and Robotics (STAR) Lab led by Professor Yang Gao.

We are a team of academic scholars, roboticists and computer scientists. Our mission is to build on the Surrey Space Centre (SSC) heritage of the 'small-sat' engineering approach and to extend this philosophy to advance autonomous systems and robotics for space.

The STAR Lab leads Surrey's involvement in the UK-RAS Network, and was an active participant in and co-organizer of the UK Robotics Week. It is also the hosting research group of the EPSRC/UKSA national hub on Future AI & Robotics for Space (FAIR-SPACE). Other international community activities include leading the IEEE-CIS Task Force on Intelligent Space Systems and Operations, the UK-RAS White Paper on Space Robotics, and TAROS 2017.

The STAR Lab also hosts a range of laboratory facilities; see the SSC robotics facilities webpage.

Enabling technologies

Our main research expertise and key research products include the following:

  • Autonomous software architecture that is domain-independent, generic and reconfigurable, based on rational agents, for complex space systems such as multi-satellite and multi-rover scenarios (a minimal agent-loop sketch follows this list)
  • Advanced software agents providing learning and planning capabilities
  • Ontology-based complex system modelling through SysML
  • Major EPSRC-funded research projects: Autonomous & Intelligent Systems Programme, Impact Acceleration Account, Strategic Collaboration Award, Capital Equipment Grant, ICT Grant on Industrial 6th Sense, etc.
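
The rational-agent concept behind this architecture can be illustrated with a short sketch. All class, method and scenario names below are hypothetical rather than the lab's actual software; the point is only that the same sense-deliberate-act loop can be reconfigured for different platforms by swapping the plan library and the executor.

```python
# Minimal sketch of a rational-agent control loop for a space asset such as a
# rover or satellite. All names here are illustrative only.
from dataclasses import dataclass, field


@dataclass
class Agent:
    """A domain-independent agent: beliefs + goals + a plan library."""
    beliefs: dict = field(default_factory=dict)
    goals: list = field(default_factory=list)
    plan_library: dict = field(default_factory=dict)   # goal -> list of actions

    def perceive(self, observations: dict) -> None:
        # Update beliefs from the latest sensor observations.
        self.beliefs.update(observations)

    def deliberate(self):
        # Pick the first goal whose plan is applicable under current beliefs.
        for goal in self.goals:
            plan = self.plan_library.get(goal)
            if plan is not None:
                return goal, plan
        return None, []

    def act(self, plan, execute) -> None:
        # Hand each action to a platform-specific executor (rover, satellite, ...).
        for action in plan:
            execute(action)


# Example: the same agent code is "reconfigured" for a rover scenario simply
# by supplying a different plan library and executor.
rover = Agent(goals=["reach_waypoint"],
              plan_library={"reach_waypoint": ["plan_path", "drive", "confirm"]})
rover.perceive({"position": (0.0, 0.0), "battery": 0.9})
goal, plan = rover.deliberate()
rover.act(plan, execute=lambda a: print("executing:", a))
```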

  • Visual sensing using optical cameras (monocular/stereo), 2D/3D LIDAR, and RGB-D sensors
  • Visual perception based on cognitive vision or saliency techniques for thematic feature extraction, sinkage prediction, and terrain classification
  • Planetary Monocular SLAM (PM-SLAM) for long-range navigation of single or multiple rovers (recipient of the IAF 3AF Edmond Brun Silver Medal for this work; a generic visual-odometry sketch follows this list)
  • Major funded research projects in relation to real-world space missions: ESA’s ExoMars PanCam and Phase A, UKSA’s CREST-1, NSTP2, and CREST-3, and Airbus/SSTL’s OOA.
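
As a rough illustration of the frame-to-frame geometry that monocular SLAM pipelines such as PM-SLAM build on, the following generic OpenCV sketch matches features between two frames and recovers the relative camera motion. It is not the PM-SLAM implementation; the file names and camera intrinsics in the usage comment are placeholders.

```python
# Generic monocular visual-odometry step with OpenCV: match features between
# two frames and recover the relative camera motion.
import cv2
import numpy as np

def relative_pose(img1, img2, K):
    """Estimate rotation R and (unit-scale) translation t between two frames."""
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:500]

    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # Essential matrix with RANSAC, then cheirality check to recover R, t.
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t   # note: monocular translation is only known up to scale

# Usage (hypothetical file names and camera intrinsics):
# K = np.array([[700, 0, 320], [0, 700, 240], [0, 0, 1]], dtype=np.float64)
# R, t = relative_pose(cv2.imread("frame0.png", 0), cv2.imread("frame1.png", 0), K)
```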

  • Developed low-cost attitude control for spinning space vehicles (such as kinetic penetrators or multi-U CubeSats) using as few as one actuator
  • Developed state-of-the-art slew algorithms including Extended Half Cone, Dual Cone, Sector Arc, and Spin Synch (a generic spin-slew dynamics sketch follows this list)
  • Major Airbus-funded R&D on a software and hardware testbed for under-actuated slew control.
  • Soil facility: developed rigorous preparation methodologies and regolith test facilities for planetary soil simulants, ranging from Martian compressible soil to icy lunar regolith
  • Vision-based approach for soil characterisation and physical property prediction
  • Major funded research projects in relation to real-world space missions: ESA’s ExoMars sampling payload testing, ESA’s Lunar Polar Sample Return (LPSR) mission’s L-GRASP testing, and CNSA’s Chang’E 3 mission terrain image analytics.
  • Biomimetic mechanism: pioneering bio-inspired Dual Reciprocating Drilling (DRD) technology, also known as the "wasp drill"
  • Deep drilling made possible by a flexible deployment mechanism in low-gravity environments, with a low-mass sampling tool suitable for small space vehicles
  • Development of innovative surface mobility systems and simulators
  • Major funded R&D: ESA’s ACT grant on Bionics and Space Systems, NPI/OHB grant on innovative deep drilling, and RAEng project on low-cost sampling
  • Exhibition at the National Science Museum 'Antenna' gallery
  • Recipient of the COSPAR outstanding paper award 2016 for this work.
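
The slew algorithms listed above are the lab's own. As generic background only, the sketch below simulates a rigid spinning vehicle whose single body-fixed thruster is fired over a fixed window of spin phase, which is the basic reason such under-actuated slews must be synchronised with the spin. The inertia values, torque level and firing window are invented for illustration; this is not any of the named algorithms.

```python
# Generic rigid-body sketch: a spinning vehicle with a single body-fixed
# thruster can still re-point its spin axis if the thruster fires only over a
# fixed window of spin phase, so the small torque impulses accumulate in
# roughly one inertial direction. All numbers are illustrative.
import numpy as np

I = np.diag([1.2, 1.0, 0.8])            # inertia tensor [kg m^2] (made up)
I_inv = np.linalg.inv(I)
w = np.array([0.0, 0.0, 2.0])           # body rates [rad/s]: spinning about z
q = np.array([1.0, 0.0, 0.0, 0.0])      # attitude quaternion (scalar first)
tau_body = np.array([0.05, 0.0, 0.0])   # torque from the single thruster [N m]
dt, T = 0.001, 60.0

def quat_mul(a, b):
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

phase = 0.0
for _ in range(int(T / dt)):
    phase = (phase + w[2] * dt) % (2 * np.pi)
    fire = phase < np.deg2rad(20)                    # fire over a 20 deg window
    tau = tau_body if fire else np.zeros(3)
    w += dt * I_inv @ (tau - np.cross(w, I @ w))     # Euler's equations
    q += dt * 0.5 * quat_mul(q, np.concatenate(([0.0], w)))
    q /= np.linalg.norm(q)

# Spin axis (body z) expressed in the inertial frame after the manoeuvre:
s, x, y, z = q
R = np.array([[1-2*(y*y+z*z), 2*(x*y-s*z),   2*(x*z+s*y)],
              [2*(x*y+s*z),   1-2*(x*x+z*z), 2*(y*z-s*x)],
              [2*(x*z-s*y),   2*(y*z+s*x),   1-2*(x*x+y*y)]])
print("spin axis in inertial frame:", R @ np.array([0.0, 0.0, 1.0]))
```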

Key space systems

The technologies developed at the STAR Lab are used to realise next-generation spacecraft and vehicles, including rovers, robotic arms, wasp drills and penetrators.

  • Smart rover: multiple small to medium-size rovers with different chassis types, e.g. mecanum-wheeled, conventionally wheeled and tracked (ranging from 20 to 100 kg each)
  • Robotic payloads: 6-DOF robotic arm, gripper, drill sampler
  • Sensors: monocular and stereo cameras, 2D and 3D LIDAR, IMU, differential GPS
  • Operating the standardized ROS middleware for testing modular autonomy functions and applications (a minimal node sketch follows this list).
  • Wasp drill: planetary drill inspired by the wood wasp ovipositor mechanism
  • Based on Dual Reciprocating Drilling (DRD), which does not rely on overhead force to operate
  • Lighter weight and higher power efficiency than conventional drilling methods such as rotary and percussive
  • Advantageous for low-gravity mission scenarios such as the Moon, asteroids and comets.
  • Micro penetrator: the probe (~10 kg) free-falls and penetrates target planetary bodies at hundreds of metres per second, tolerating a small angle of attack (less than 8 degrees)
  • Fully instrumented with engineering and scientific payloads
  • Involved in the MoonLITE and LunarEX/NET missions and the UK Penetrator Consortium.
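
As an example of the kind of modular autonomy function that can be tested over ROS on such a rover, here is a minimal, hypothetical rospy node that drives forward and stops when the 2D LIDAR reports a nearby obstacle. The topic names and thresholds are assumptions, not the lab's actual interfaces.

```python
#!/usr/bin/env python
# Minimal ROS (rospy) node of the kind used to exercise a modular autonomy
# function on a rover testbed. Topic names and thresholds are assumptions.
import rospy
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist


class SimpleObstacleStop(object):
    """Drive forward; stop when the 2D LIDAR sees something closer than 0.5 m."""

    def __init__(self):
        self.cmd_pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
        rospy.Subscriber("/scan", LaserScan, self.on_scan)

    def on_scan(self, scan):
        valid = [r for r in scan.ranges if r > scan.range_min]
        nearest = min(valid) if valid else float("inf")
        cmd = Twist()
        cmd.linear.x = 0.0 if nearest < 0.5 else 0.3   # m/s
        self.cmd_pub.publish(cmd)


if __name__ == "__main__":
    rospy.init_node("simple_obstacle_stop")
    SimpleObstacleStop()
    rospy.spin()
```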

Space and non-space missions

The ExoMars mission aims to demonstrate key planetary robotics technologies such as rover autonomy and deep drilling. The STAR Lab was involved in the Phase A study and the UK PanCam payload studies, and has been developing next-generation rover autonomy and planetary drilling technologies, as well as advanced rover locomotion technologies, in collaboration with Airbus DS.

The follow-on studies have been looking at next-generation GNC for planetary rovers, aiming for faster travelling speeds and longer traverse distances through higher autonomy enabled by robotic vision and machine learning techniques. Future missions that will benefit from these advancements include Mars or Phobos sample return missions.
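
A toy sketch of the underlying idea, coupling a vision-derived terrain class to a commanded traverse speed, is given below. The features, terrain classes and speed limits are invented and the classifier is generic; this is not the lab's GNC software.

```python
# Toy illustration of coupling vision-based terrain classification to traverse
# speed selection. Features, classes and speed limits are invented.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical training data: per-patch image statistics
# (mean intensity, intensity std, edge density) labelled by terrain type.
X_train = np.array([[0.8, 0.05, 0.10],   # bedrock
                    [0.6, 0.20, 0.40],   # loose rocks
                    [0.7, 0.10, 0.05],   # compacted soil
                    [0.5, 0.30, 0.60]])  # rubble
y_train = ["bedrock", "rocks", "soil", "rubble"]

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

# Map each terrain class to a conservative traverse speed limit [m/s].
speed_limit = {"bedrock": 0.10, "soil": 0.08, "rocks": 0.03, "rubble": 0.01}

patch_features = np.array([[0.65, 0.18, 0.35]])      # features of a new patch
terrain = clf.predict(patch_features)[0]
print(terrain, "-> commanded speed:", speed_limit[terrain], "m/s")
```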

The Proba3 mission aims to demonstrate, for the first time, key technologies that enable precision formation flying between two spacecraft, such as a high-accuracy measurement system.

The STAR Lab has been involved in the calibration of the Fine Lateral and Longitudinal Sensor (FLLS) on board one of the Proba3 spacecraft and in its ground testing. FLLS could allow large-scale structures to be deployed and maintained in space, monitoring structural distortion before, during and after deployment, and providing in-flight corrections to data collection.

Examples of such future applications include in-orbit observatories, positioning of telecommunication satellite antennas, and deployable mechanisms on Moon or Mars missions.
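
In generic terms (this is not the FLLS algorithm), the core measurement can be thought of as splitting a relative-position vector into a longitudinal component along the inter-satellite boresight and a lateral offset across it, as in the simple projection below. The example numbers assume Proba3's nominal separation of roughly 144 m.

```python
# Generic decomposition of a measured relative position into longitudinal
# (along the boresight) and lateral components. Not the FLLS implementation.
import numpy as np

def lateral_longitudinal(rel_pos, boresight):
    """Split rel_pos [m] into components along and across the unit boresight."""
    u = boresight / np.linalg.norm(boresight)
    longitudinal = float(np.dot(rel_pos, u))       # signed distance along boresight
    lateral = rel_pos - longitudinal * u           # in-plane offset vector
    return longitudinal, lateral

# Example: two spacecraft nominally ~144 m apart along the boresight,
# with a small lateral and longitudinal error.
lon, lat = lateral_longitudinal(np.array([0.02, -0.01, 144.3]),
                                np.array([0.0, 0.0, 1.0]))
print("longitudinal [m]:", lon, " lateral offset [m]:", lat)
```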

The LunaResurs mission aims to land at the lunar south pole to search for minerals and water. The STAR Lab is involved in developing the landing sensor called LEIA, a LIDAR instrument that enables the lander to avoid uneven terrain and land safely.

LEIA (LIDAR for Extra-terrestrial Imaging Applications) will provide a 3D map of the lunar surface from two altitudes during landing, 1.3 km and 250 m, with resolutions of 1.0 m and 0.1 m respectively. Due to launch in 2021, LunaResurs will be the first mission to use most of the LIDAR's components in space or on the lunar surface.
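
As a generic illustration of how a descent LIDAR map can feed hazard avoidance (not the LEIA flight software), the sketch below grids a point cloud into a coarse elevation map and flags cells whose local roughness exceeds a threshold. The cell size, threshold and synthetic terrain are invented.

```python
# Generic sketch: turn a descent LIDAR point cloud into a coarse elevation map
# and flag hazardous cells by local roughness. Parameters are illustrative.
import numpy as np

def hazard_map(points, cell=1.0, roughness_limit=0.15):
    """points: (N, 3) array of x, y, z in metres; returns (elevation, hazard)."""
    ix = np.floor(points[:, 0] / cell).astype(int)
    iy = np.floor(points[:, 1] / cell).astype(int)
    ix -= ix.min(); iy -= iy.min()
    nx, ny = ix.max() + 1, iy.max() + 1

    elevation = np.full((nx, ny), np.nan)
    hazard = np.zeros((nx, ny), dtype=bool)
    for i in range(nx):
        for j in range(ny):
            z = points[(ix == i) & (iy == j), 2]
            if z.size:
                elevation[i, j] = z.mean()
                hazard[i, j] = z.std() > roughness_limit   # rough cell -> unsafe
    return elevation, hazard

# Usage with synthetic terrain: a flat surface plus one boulder-like bump.
rng = np.random.default_rng(0)
pts = rng.uniform(0, 10, size=(5000, 3)); pts[:, 2] *= 0.01   # flat ground
pts[:100, :2] = rng.uniform(4.0, 5.0, size=(100, 2))          # boulder footprint
pts[:100, 2] += 0.5                                           # boulder height
elev, haz = hazard_map(pts)
print("hazardous cells:", int(haz.sum()), "of", haz.size)
```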

Subject to a rigorous test campaign, this mission will pave the way for more extensive applications of LIDAR technologies in future space missions.

The STAR Lab has been transferring space robotics and autonomous systems technologies into the nuclear sector by developing:

  • Reconfigurable autonomous software architecture for the operation of nuclear plants, where design requirements and challenges are similar to those of relevant space applications
  • Customised robotic vision system for autonomous inspection and classification of nuclear waste going through the ‘sort-and-segregate’ process
  • Customised machine vision system for autonomous detection, measurement and tracking of the dynamic behaviour of gas bubbles in liquid ponds at nuclear decommissioning sites (a generic detection sketch follows this list).
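
A generic single-frame version of such bubble detection could look like the OpenCV sketch below; the file name, thresholds and circularity test are assumptions rather than the lab's customised system. Tracking across frames could then associate detections, for instance by nearest centroid.

```python
# Generic OpenCV sketch: detect and size bright, bubble-like blobs in one
# frame of pond-surface video. File name and thresholds are hypothetical.
import cv2

frame = cv2.imread("pond_frame.png", cv2.IMREAD_GRAYSCALE)
blurred = cv2.GaussianBlur(frame, (5, 5), 0)

# Threshold bright regions, then find closed contours and keep circular ones.
_, mask = cv2.threshold(blurred, 200, 255, cv2.THRESH_BINARY)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

bubbles = []
for c in contours:
    area = cv2.contourArea(c)
    if area < 10:                                # ignore speckle noise
        continue
    (x, y), radius = cv2.minEnclosingCircle(c)
    circularity = area / (3.14159 * radius * radius)
    if circularity > 0.6:                        # roughly circular -> bubble
        bubbles.append((x, y, radius))

print("detected bubbles:", len(bubbles))
```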

The STAR Lab has been transferring and integrating space robotics and autonomous systems technologies into the agriculture sector by developing:

  • Computer vision techniques to detect anomalies in crops and related agricultural applications, including spectrum-based segmentation and automatic selection using clustering algorithms. For example, a close-up leaf image is segmented and analysed so that diseased sections are identified and extracted, and diseased/healthy leaf metrics are calculated (an illustrative clustering sketch follows this list)
  • Autonomous rovers and navigation capabilities applicable to agricultural robots
  • Integration of satellite, aerial and ground-based data systems and robotic platforms.
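
An illustrative version of the clustering step might look like the sketch below, which segments a leaf image into colour clusters with k-means and reports the share of pixels in the least-green cluster as a crude diseased/healthy metric. The file name and the least-green heuristic are assumptions, not the lab's pipeline.

```python
# Illustrative colour-clustering sketch: segment a close-up leaf image into
# k colour clusters and report the fraction of pixels in the least-green
# cluster as a crude "diseased" metric. File name and heuristic are assumed.
import cv2
import numpy as np

img = cv2.imread("leaf_closeup.png")                 # BGR image
pixels = img.reshape(-1, 3).astype(np.float32)

# k-means on raw pixel colours (k=3: healthy tissue, diseased tissue, background)
criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
_, labels, centres = cv2.kmeans(pixels, 3, None, criteria, 5,
                                cv2.KMEANS_PP_CENTERS)

# Heuristic: the cluster whose centre has the lowest green channel is treated
# as diseased tissue (BGR order, so index 1 is the green channel).
diseased_cluster = int(np.argmin(centres[:, 1]))
diseased_fraction = float(np.mean(labels.ravel() == diseased_cluster))
print("diseased / total pixel ratio: %.2f" % diseased_fraction)
```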
