Robots will be paired with a versatile AI that can quickly adapt to unpredictable conditions when examining underwater infrastructure.
Some of a nation’s most vital infrastructure hides beneath the water. The difficulty in accessing most of it, however, makes important damage checks infrequent.
Sending humans down requires significant training, and recovering from the often extreme depths can take several weeks. There are far more underwater structures than skilled divers to inspect them.
Robots have been designed to carry out some of these dangerous tasks. The problem is that, until now, they’ve lacked the smarts to deal with the unpredictable, rapidly changing nature of underwater conditions.
Researchers from Stevens Institute of Technology are working on algorithms that enable these underwater robots to inspect and protect infrastructure.
Their work is led by Brendan Englot, Professor of Mechanical Engineering at Stevens.
“There are so many difficult disturbances pushing the robot around, and there is often very poor visibility, making it hard to give a vehicle underwater the same situational awareness that a person would have just walking around on the ground or being up in the air,” says Englot.
Englot and his team are using reinforcement learning to train their algorithms. Rather than relying on an exact mathematical model, the robot performs actions and observes whether they help it attain its goal.
Through trial and error, the algorithm is updated with the collected data to figure out the best ways to deal with changing underwater conditions. This enables the robot to manoeuvre and navigate successfully even in previously unmapped areas.
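To make the trial-and-error idea concrete, here is a minimal sketch of tabular Q-learning, a basic reinforcement-learning method. It is an illustrative toy, not the Stevens team's actual algorithm: a simulated robot learns to cross a one-dimensional channel toward a goal while a random "current" occasionally pushes it backward, and all names and parameters here are invented for the example.

```python
import random

def train(episodes=2000, length=10, alpha=0.5, gamma=0.9, epsilon=0.2, seed=0):
    """Learn to reach the right end of a 1-D channel despite a random current."""
    rng = random.Random(seed)
    # Q[state][action]: estimated value of moving left (0) or right (1).
    q = [[0.0, 0.0] for _ in range(length)]
    for _ in range(episodes):
        state = 0
        for _ in range(50):  # cap episode length
            # Epsilon-greedy: mostly exploit what was learned, sometimes explore.
            if rng.random() < epsilon:
                action = rng.randrange(2)
            else:
                action = 0 if q[state][0] > q[state][1] else 1
            step = -1 if action == 0 else 1
            if rng.random() < 0.2:  # unpredictable current shoves the robot back
                step = -1
            nxt = max(0, min(length - 1, state + step))
            reward = 1.0 if nxt == length - 1 else -0.01
            # Update the estimate from the observed outcome -- no model of
            # the environment is required, only the action and its result.
            q[state][action] += alpha * (
                reward + gamma * max(q[nxt]) - q[state][action])
            state = nxt
            if state == length - 1:
                break
    return q

# Greedy policy after training: which direction each state prefers.
policy = [0 if values[0] > values[1] else 1 for values in train()]
```

The key point mirrors the article: the update rule uses only observed rewards, so the robot can adapt to disturbances it was never explicitly modelled for.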
A robot was recently sent on a mission to map a pier in Manhattan.
“We didn’t have a prior model of that pier,” says Englot. “We were able to just send our robot down and it was able to come back and successfully locate itself throughout the whole mission.”
The robots gather data with sonar, widely regarded as the most reliable sensing method for undersea navigation. It works similarly to a dolphin’s echolocation, measuring how long high-frequency chirps take to bounce off nearby structures.
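The ranging principle behind sonar can be sketched in a few lines: distance is half the round-trip time multiplied by the speed of sound in water. The 1,500 m/s figure below is a typical value for seawater assumed for illustration; real systems correct for temperature, salinity, and depth.

```python
SOUND_SPEED_SEAWATER = 1500.0  # metres per second (typical; varies with conditions)

def range_from_echo(round_trip_seconds: float) -> float:
    """Distance to a structure from a chirp's round-trip travel time.

    The chirp travels out and back, so the one-way distance is half
    the total path covered at the speed of sound.
    """
    return SOUND_SPEED_SEAWATER * round_trip_seconds / 2.0

# A chirp returning after 40 ms implies a structure about 30 m away.
distance = range_from_echo(0.04)
```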
A pitfall of this approach is that the resulting imagery is similar to a grayscale medical ultrasound. Englot and his team believe that once a structure has been mapped, a second pass by the robot could use a camera to capture high-resolution images of critical areas.
It’s still early days, but Englot’s project is an example of how AI is enabling a new era for robotics, one that improves efficiency while reducing the risks to humans.