Robotic Exploration of an Unknown Nuclear Environment
This video shows the result of collaborative work between the University of Manchester in the UK and CSIRO in Australia as part of the RNE EPSRC programme grant. Together we built the first autonomous mobile robot that uses real-time measurements from a gamma radiation detector to plan paths which minimise the overall radiation dose the robot receives during a mission. Open access paper available here: https://www.mdpi.com/2218-6581/10/2/78
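To make the idea concrete, here is a minimal sketch of dose-aware planning (not the method from the paper: the grid representation, robot speed, cell size and dose-rate map are all illustrative assumptions), in which a shortest-path search penalises each step by the dose estimated to accrue while crossing it:

    import heapq
    import numpy as np

    def plan_min_dose_path(dose_rate, start, goal, speed=0.5, cell=1.0):
        """Dijkstra over a 2D grid; edge cost = dose accrued while crossing a cell.

        dose_rate : 2D array of estimated dose rate (e.g. uSv/h) per cell,
                    built up from the gamma detector readings.
        speed, cell : robot speed (m/s) and cell size (m), giving traversal time.
        """
        rows, cols = dose_rate.shape
        dist = np.full((rows, cols), np.inf)
        prev = {}
        dist[start] = 0.0
        queue = [(0.0, start)]
        while queue:
            d, (r, c) = heapq.heappop(queue)
            if (r, c) == goal:
                break
            if d > dist[r, c]:
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    # dose accumulated while traversing the neighbouring cell
                    step = dose_rate[nr, nc] * (cell / speed) / 3600.0
                    if d + step < dist[nr, nc]:
                        dist[nr, nc] = d + step
                        prev[(nr, nc)] = (r, c)
                        heapq.heappush(queue, (d + step, (nr, nc)))
        # Reconstruct the path from goal back to start.
        path, node = [goal], goal
        while node != start:
            node = prev[node]
            path.append(node)
        return path[::-1], dist[goal]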
IROS 2020 Video Presentation
This is the video presentation that accompanies our IROS 2020 conference paper (Model Identification of a Small Omnidirectional Aquatic Surface Vehicle: a Practical Implementation). In the paper we present a method for capturing the physics (system model) of a small aquatic surface vehicle using only onboard sensors. We also present a new method of combining the thruster forces so that the thrusters avoid their dead zones.
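As a rough sketch of the underlying allocation problem (the thruster layout, allocation matrix and dead-band handling below are illustrative assumptions, not the scheme presented in the paper), a desired body wrench is mapped to individual thruster forces and small commands are pushed out of the dead band:

    import numpy as np

    # Hypothetical 4-thruster layout for a small omnidirectional ASV:
    # columns give each thruster's contribution to surge X, sway Y and yaw N.
    B = np.array([
        [1.0,  1.0, 0.0,  0.0],   # surge
        [0.0,  0.0, 1.0,  1.0],   # sway
        [0.3, -0.3, 0.2, -0.2],   # yaw moment (assumed lever arms, metres)
    ])

    DEAD_BAND = 0.05  # assumed thrust magnitude (N) below which a thruster produces no force

    def allocate(wrench):
        """Least-squares allocation of a body wrench [X, Y, N] to thruster forces,
        then push any non-zero command outside the dead band."""
        forces = np.linalg.pinv(B) @ np.asarray(wrench, dtype=float)
        small = (np.abs(forces) > 1e-6) & (np.abs(forces) < DEAD_BAND)
        forces[small] = np.sign(forces[small]) * DEAD_BAND  # naive dead-band avoidance
        return forces

    print(allocate([0.2, 0.0, 0.01]))

The clamp above is only the simplest possible stand-in; the paper's contribution is a more principled way of combining the thruster forces so that dead zones are avoided in the first place.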
VAM HRI 2020 Presentation
VAM HRI 2020 presentation by RNE’s Paul Bremner at Bristol Robotics Laboratory. The work was presented at the Second International Workshop on Virtual, Augmented and Mixed Reality for Human-Robot Interaction, and showcases an adaptation of the well-known heuristic evaluation methodology for evaluating the user interfaces of VR information-visualisation systems. The conference was held online in August 2020; see http://vam-hri.xyz/ for more info on the conference. The paper from this work can be viewed here.
ACM/IEEE International Conference on Human-Robot Interaction
ACM/IEEE International Conference on Human-Robot Interaction, video from BRL’s Tom Bridgwater. The video showcases the results of a study conducted in a virtual nuclear environment to test whether incorporating a human approach to risk taking into a robot’s decision-making process improves user trust.
Land MallARD
A video showcasing Land-MallARD, a holonomic ground robot that mimics the physics (dynamic behaviour) of the surface vehicle MallARD, which was developed in response to the IAEA’s 2017 Robotics Challenge. It does this by running a simulation of MallARD’s dynamic model: the ground robot drives as if it were an omnidirectional boat propelled by thrusters. This speeds up software development by removing the need for extensive pool testing, since the same software runs on the ground vehicle and the boat platform. It also improves on pure simulation: it uses real sensors, the code remains the same, and it serves as a low-cost solution.
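A minimal sketch of this arrangement (the mass, damping and time step below are illustrative, not MallARD’s identified parameters): thruster commands are passed through a simple planar boat model, and the simulated body velocity becomes the velocity command for the holonomic ground base:

    import numpy as np

    # Hypothetical planar boat model: mass/inertia and linear drag, not MallARD's identified values.
    M = np.diag([12.0, 12.0, 1.5])   # kg, kg, kg m^2
    D = np.diag([6.0, 8.0, 1.0])     # linear damping

    def step_boat_model(v, tau, dt=0.02):
        """Integrate body velocity v = [u, v, r] one step under body wrench tau = [X, Y, N]."""
        acc = np.linalg.solve(M, np.asarray(tau, float) - D @ v)
        return v + acc * dt

    # The resulting body velocity is then sent as the ground robot's velocity command
    # (e.g. a ROS geometry_msgs/Twist), so the wheels reproduce the boat's motion.
    v = np.zeros(3)
    for _ in range(50):                      # 1 s of simulated thruster input
        v = step_boat_model(v, [5.0, 0.0, 0.2])
    print("commanded ground velocity [u, v, r]:", v)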
UI Environment
Theme 2 from Bristol Robotics Laboratory have produced this video showcasing the UI Environment. This is a virtual-reality-based method for presenting environment mapping data, using a navigation and control scheme inspired by computer games. Objects are captured with an RGBD camera, and the point clouds from multiple captures are integrated to form a single point cloud. In this test environment, a set of objects has been captured individually and used to compose the environment; the noise around each object is a consequence of the point cloud integration procedure. The point cloud environment can be viewed either in normal RGB mode or with the points coloured according to their level of radiation; for this test environment, radiation sources have been procedurally added. A user study was run in this initial test environment to investigate the efficacy of the design choices in creating an intuitive control system, users’ ability to recognise objects from point clouds (part of which necessitates filtering out the noise), and their ability to characterise the radiation sources.
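As an illustration of the radiation view (a minimal sketch: the inverse-square field, the blue-to-red colour ramp and the randomly generated points are assumptions, not the project’s implementation), each point can be coloured by a dose value sampled from the procedurally placed sources:

    import numpy as np

    def radiation_colours(points, sources, strengths):
        """Colour an (N, 3) point cloud by a simple inverse-square field from point sources.

        points    : (N, 3) array of XYZ positions
        sources   : (M, 3) array of source positions (procedurally placed)
        strengths : (M,) array of source intensities (arbitrary units)
        Returns (N, 3) RGB colours in [0, 1], blue = low dose, red = high dose.
        """
        d2 = ((points[:, None, :] - sources[None, :, :]) ** 2).sum(axis=2)  # squared distances
        dose = (strengths[None, :] / np.maximum(d2, 1e-3)).sum(axis=1)      # inverse-square sum
        t = (dose - dose.min()) / (dose.max() - dose.min() + 1e-9)          # normalise to [0, 1]
        return np.stack([t, np.zeros_like(t), 1.0 - t], axis=1)             # blue -> red ramp

    pts = np.random.rand(1000, 3) * 5.0
    cols = radiation_colours(pts, np.array([[2.5, 2.5, 1.0]]), np.array([10.0]))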
Heterogeneous Robot Map Merging
A video showcasing a lab-based demonstration of autonomous map merging where the robots have different sensing capabilities. The video was filmed at RNE’s lab in West Cumbria, the Robotics for Extreme Environments Laboratory (REEL), which hosts cross-institutional researchers. Bristol Robotics Laboratory (BRL) researcher and Visiting Academic at The University of Manchester, Dr Craig West, demonstrates his latest progress on work towards a heterogeneous robot team for nuclear inspection.
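For illustration only (this assumes the two maps are already aligned in a common frame, which is the difficult part in practice, and the fusion rule is a generic one rather than the method shown in the video), occupancy grids from robots with different sensors can be fused cell by cell:

    import numpy as np

    UNKNOWN = -1  # ROS-style occupancy grid convention: -1 unknown, 0..100 occupancy

    def merge_grids(a, b):
        """Fuse two aligned occupancy grids: keep known cells over unknown ones,
        and take the more pessimistic (higher) occupancy where both are known."""
        merged = np.where(a == UNKNOWN, b, a)          # fill unknowns from the other map
        both = (a != UNKNOWN) & (b != UNKNOWN)
        merged[both] = np.maximum(a[both], b[both])    # conservative fusion where both observe
        return merged

    lidar_map  = np.array([[-1, 0, 100], [0, 0, -1]])
    camera_map = np.array([[0, -1, 100], [0, 100, 0]])
    print(merge_grids(lidar_map, camera_map))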
MallARD Update
A video update demonstrating the improved MallARD capability, following feedback from the IAEA’s 2017 Robotics Challenge, in which MallARD came second overall. MallARD is an autonomous surface vehicle (ASV) designed for the inspection and monitoring of wet nuclear storage facilities such as spent fuel pools or wet silos. The ASV is holonomic, uses a LiDAR for localisation, and features a robust trajectory-tracking controller.
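As a hedged sketch of what a trajectory-tracking controller does (the gains, frames and proportional-plus-feedforward law below are illustrative, not MallARD’s published controller), the pose error to the reference trajectory is rotated into the body frame and turned into a velocity command:

    import numpy as np

    def track(pose, ref_pose, ref_vel, kp=(1.0, 1.0, 0.8)):
        """Simple proportional + feedforward tracking for a holonomic ASV.

        pose, ref_pose : [x, y, yaw] of vehicle and reference point (world frame)
        ref_vel        : [vx, vy, yaw_rate] of the reference (world frame)
        Returns a body-frame velocity command [u, v, r].
        """
        err = np.asarray(ref_pose, float) - np.asarray(pose, float)
        err[2] = (err[2] + np.pi) % (2 * np.pi) - np.pi   # wrap heading error to [-pi, pi]
        yaw = pose[2]
        R = np.array([[ np.cos(yaw), np.sin(yaw), 0.0],   # world -> body rotation
                      [-np.sin(yaw), np.cos(yaw), 0.0],
                      [0.0,          0.0,         1.0]])
        return R @ (np.asarray(ref_vel, float) + np.array(kp) * err)

    cmd = track([0.0, 0.0, 0.0], [1.0, 0.5, 0.1], [0.2, 0.0, 0.0])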
MultiRobot Map Merge
A video demonstrating laboratory-based testing of multi-robot mapping of unknown areas. The video was filmed at RNE’s lab in West Cumbria, the Robotics for Extreme Environments Laboratory (REEL), which is a cross-institutional lab hosting members from The University of Manchester and Bristol Robotics Laboratory. This work was carried out by BRL researcher and Visiting Academic at UoM, Dr Craig West.
MallARD IAEA 2017 Entry
University of Manchester researcher Dr Keir Groves designed and built the autonomous surface vehicle (ASV) for the IAEA’s Robotics Challenge 2017; it placed in the top three of the second round in November 2017. The MallARD went on to compete in a final, third round, where it was deployed by the IAEA in a spent fuel pond at a nuclear power plant in Finland, along with two other entries. The MallARD came second overall in November 2018.