Autonomous Exploration with Virtual Reality Visualization
Published:
Project page / Project report / Video / Presentation / Blog
Students: Artemis Georgopoulo, Fabiano Junior Maia Manschein, Yasmina Feriel Djelil, Shani Israelov
The goal of this project is to combine autonomous exploration algorithms running on a mobile robot with a virtual reality environment for visualizing the collected data. On the physical side, the SAMPO2 robot is outfitted with the Karelics Brain software and all the necessary hardware. The SLAM algorithm is already implemented and ready for use. While mapping the environment, the robot moves around autonomously according to goal poses set by the exploration algorithm. On the digital side, i.e., the Unity scene, the user can view the robot's sensor data and the 2D map generated by the SLAM algorithm.
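The report does not specify which exploration algorithm is used, but a common choice for picking goal poses on a 2D SLAM map is frontier-based exploration: the robot repeatedly drives toward the nearest "frontier", a free cell bordering unexplored space. A minimal illustrative sketch, assuming the ROS OccupancyGrid cell convention (-1 = unknown, 0 = free, 100 = occupied); the grid, function names, and robot position are hypothetical:

```python
# Frontier-based goal selection on a 2D occupancy grid (sketch).
# Assumed cell values (ROS OccupancyGrid convention):
#   -1 = unknown, 0 = free, 100 = occupied.
from math import hypot

UNKNOWN, FREE, OCCUPIED = -1, 0, 100

def find_frontiers(grid):
    """Return (row, col) of free cells adjacent to unknown space."""
    rows, cols = len(grid), len(grid[0])
    frontiers = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != FREE:
                continue
            neighbors = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
            if any(0 <= nr < rows and 0 <= nc < cols
                   and grid[nr][nc] == UNKNOWN
                   for nr, nc in neighbors):
                frontiers.append((r, c))
    return frontiers

def next_goal(grid, robot_pos):
    """Pick the nearest frontier cell as the next exploration goal."""
    frontiers = find_frontiers(grid)
    if not frontiers:
        return None  # map fully explored, exploration is done
    return min(frontiers, key=lambda f: hypot(f[0] - robot_pos[0],
                                              f[1] - robot_pos[1]))

grid = [
    [0,   0, -1],
    [0, 100, -1],
    [0,   0,  0],
]
print(next_goal(grid, (0, 0)))  # → (0, 1)
```

In a real system the selected cell would be converted to world coordinates and sent to the navigation stack as a goal pose; the loop repeats until no frontiers remain.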
The project’s potential use cases include exploring unknown environments with the robot while a human user remotely monitors, and possibly controls, the robot and its navigation. Natural disaster areas, where potential victims may be trapped and human rescuers cannot safely enter, are one such environment. Another is space exploration, in which the robot explores planets or other celestial bodies.
SAMPO2 and the Karelics Brain software were provided by Karelics Ltd., a Finnish robotics software development company founded in 2019 that focuses on robotics solutions for the construction industry. The company positions itself as the key link between robot manufacturers and construction companies, with software that makes robots smart and easy to use.