Untethered Vehicle Tech Continues to Advance Autonomy
By Justin Manley
Kongsberg Maritime offered two presentations: “How Autonomous is Your AUV,” presented by Richard Mills, and “Autonomous Technology for Ocean Exploration,” by Arnt Olsen. Together these presentations used the context of Kongsberg’s vehicles, especially the Hugin AUV and the Sounder USV, to discuss what autonomous technology means and what it is capable of. In the case of the USV there is a clear spectrum of operator control with four stages: 1) line-of-sight remote control, in which an operator uses a handheld remote for docking and harbor maneuvering; 2) remote control beyond line of sight, where an onshore operator controls the voyage and makes changes to speed, heading, and other onboard systems; 3) supervised mode, in which the USV follows a pre-set mission plan that the operator may change during the mission; and 4) fully autonomous operation, in which the USV solves the tasks set in the mission plan, makes decisions by itself, and avoids collisions. While this spectrum may be readily apparent to practicing UUV users, it provides a useful construct for discussing the underlying technologies that enable a USV.
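The four-stage spectrum can be sketched as a simple ordered mode model. The names and helper below are purely illustrative, not Kongsberg's actual software or terminology:

```python
from enum import IntEnum

class ControlMode(IntEnum):
    """Illustrative four-stage USV operator-control spectrum."""
    LINE_OF_SIGHT_REMOTE = 1   # handheld remote for docking and harbor maneuvering
    BEYOND_LINE_OF_SIGHT = 2   # onshore operator adjusts speed, heading, systems
    SUPERVISED = 3             # pre-set mission plan; operator may change it mid-mission
    FULLY_AUTONOMOUS = 4       # onboard decisions, task solving, collision avoidance

def operator_may_edit_plan(mode: ControlMode) -> bool:
    """Up to and including supervised mode, a human can still alter the plan live."""
    return mode <= ControlMode.SUPERVISED
```

Modeling the stages as an ordered enum captures the idea that each stage hands more decision-making to the vehicle than the one before it.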
Key elements of successful USV operations include direct vehicle control and fleet management, a USV world model, and image recognition. The first is perhaps the most obvious and a relatively direct transition from crewed vessel operations, but the second and third elements are quite sophisticated. The USV world model draws upon three-dimensional environment modeling, geo-referencing, image recognition, and sensor fusion, with the overall model evaluated via machine learning and artificial intelligence techniques.
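A toy sketch can illustrate how geo-referenced image-recognition outputs might feed such a world model. The data structure and the naive highest-confidence fusion rule here are assumptions for illustration, not Kongsberg's implementation:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One geo-referenced output of an image-recognition pipeline."""
    label: str        # e.g. "vessel", "buoy"
    lat: float
    lon: float
    confidence: float

def fuse(detections):
    """Naive sensor fusion: keep the highest-confidence detection per
    coarse geo-cell (rounding to ~0.001 degree, roughly 100 m)."""
    best = {}
    for d in detections:
        cell = (round(d.lat, 3), round(d.lon, 3))
        if cell not in best or d.confidence > best[cell].confidence:
            best[cell] = d
    return list(best.values())
```

A real world model would fuse many sensor modalities over time; the point of the sketch is only that geo-referencing gives the common frame in which fusion happens.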
Kongsberg’s AUV presentation addressed these ideas as well, describing a similar spectrum of autonomy. It begins with mission autonomy: getting from A to B, following a route or mission plan, and managing sensors. The next level is advanced autonomy, including reactive behavior, collision avoidance, and mission tasks such as following a pipe. The highest level is adaptive autonomy, covering proactive behaviors, onboard decision making, and adjusting mission outcomes. This culminates in a perspective of nested capabilities, including deep learning, machine learning, and true artificial intelligence.
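The nesting of these levels can be made concrete with a small sketch: each level inherits the behaviors of the levels below it. The level names follow the presentation, but the mapping itself is an illustrative assumption:

```python
# Illustrative mapping of the three AUV autonomy levels to example behaviors.
AUTONOMY_LEVELS = {
    "mission":  ["follow route/mission plan", "manage sensors"],
    "advanced": ["reactive behavior", "collision avoidance", "pipe following"],
    "adaptive": ["proactive behavior", "onboard decision making",
                 "adjust mission outcomes"],
}

def capabilities_at(level: str) -> list[str]:
    """Levels nest: a vehicle at one level retains everything below it."""
    order = ["mission", "advanced", "adaptive"]
    caps = []
    for name in order[: order.index(level) + 1]:
        caps.extend(AUTONOMY_LEVELS[name])
    return caps
```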
EIVA’s Antonio Felipe Siva presented “Visually Aided Navigation and Automated Subsea Inspection Using Deep Learning and Computer Vision,” a briefing on how the company’s software employs machine learning, in particular for advanced subsea feature tracking. EIVA recently released a paper describing a new feature detection algorithm that outperforms current algorithms on the market and will be incorporated into its software tools. Those tools can generate a three-dimensional mesh from any point cloud, whether derived from sonar or from analyzed video data. The end product is a “3D mesh” with the “full resolution of the image,” even though the underlying point cloud has been reduced.
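The idea of reducing the point cloud while keeping visual resolution can be illustrated with a standard voxel-grid decimation step. This is a generic sketch of the technique, not EIVA's algorithm:

```python
from collections import defaultdict

def voxel_downsample(points, voxel=0.1):
    """Reduce a point cloud by averaging all points that fall in the same
    voxel. The decimated geometry can then be meshed, with the full-resolution
    imagery projected onto the mesh as texture, so visual detail survives
    even though the underlying point count drops."""
    bins = defaultdict(list)
    for p in points:
        key = tuple(int(c // voxel) for c in p)
        bins[key].append(p)
    return [tuple(sum(c) / len(pts) for c in zip(*pts))
            for pts in bins.values()]
```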
Another presentation of novel ideas came from Terradepth’s Ken Childress. This new startup in Austin, Texas is pioneering a networked systems approach for ocean data collection. Its approach aims to extend mission duration with high-power payload capacity while limiting offshore personnel through autonomous data collection and quality assurance, and to reduce onshore staffing with autonomous data processing.
Key to this vision are efficiency through scale of operations and reliability from many interchangeable vehicles, which Terradepth terms AxVs.
The Terradepth Concept of Operations (CONOPS) includes: 1) surface and subsurface capability, 2) two or more identical vehicles, 3) launch from shore or sea, 4) open ocean transit to the survey site, 5) extended time at sea, 6) multi-vehicle cooperation, and 7) precision data gathering. This is executed via a single hybrid autonomous marine vehicle configuration capable of both surface and subsurface operation. Terradepth has a patent pending on this CONOPS and is preparing for initial demonstrations in mid-2020. In this concept the individual AxVs employ autonomy concepts similar to the AUV and USV ideas discussed by Kongsberg, but the overall network also offers opportunities for new approaches to fleet management and individual AxV tasking.
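One way to picture fleet tasking across identical, interchangeable vehicles is a simple round-robin split of survey legs. This is a minimal sketch under that assumption, not Terradepth's patented CONOPS logic:

```python
def assign_legs(vehicles, legs):
    """Round-robin tasking: because the AxVs are interchangeable, survey
    legs can be dealt out evenly with no per-vehicle specialization."""
    plan = {v: [] for v in vehicles}
    for i, leg in enumerate(legs):
        plan[vehicles[i % len(vehicles)]].append(leg)
    return plan
```

In practice a fleet manager would weigh battery state, position, and payload, but interchangeability is what makes such simple allocation schemes possible at all.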
While the Terradepth ideas are based on new vehicle designs and Kongsberg focused on the software/autonomy elements of unmanned maritime systems, another new entrant addressed both. Stone Aerospace is also based in Austin, Texas. After many years of specialty research and development projects, most funded by NASA, it introduced its first commercial concept, SUNFISH, in a briefing by Kristof Richmond, “Demonstrations of a human-portable hovering AUV for complex 3D environments.” This addressed both a new AUV design and a powerful software offering.
The vehicle itself is described as able to fly like a jet and hover like a helicopter. It is a person-portable system that addresses many challenges of autonomous exploration in complex environments, designed to operate in a wide variety of undersea settings ranging from man-made, such as piers or harbors, to natural, such as reefs or caves. The core vehicle is 1.6 meters long, 0.47 meters wide, and only 0.2 meters high; at 50 kilograms, the system is highly portable. SUNFISH can stop in mid-“flight,” hover, and perform proximity operations along objects it has mapped, doing so at specified stand-off ranges while acquiring high-resolution photos and geometry. This precision, combined with sophisticated artificial intelligence, makes SUNFISH ideal for complex and confined environments.
During operations SUNFISH can report back over an optional data-only tether, allowing a surface-based team to review data and re-task the vehicle; the system can also operate fully autonomously. While the mechanical design of the AUV is interesting, it is the powerful onboard software that is truly innovative. The vehicle employs field-proven artificial intelligence (AI) and simultaneous localization and mapping (SLAM).
This capability has been used in a series of tests and demonstrations in unstructured, labyrinthine cave environments. SUNFISH has autonomously explored numerous caves, creating a real-time map as it proceeds and using that map to navigate. The robot was then able to autonomously find its way back to its starting point using the map it had just created.
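The return-to-start behavior can be illustrated with a breadth-first search over the occupancy map the vehicle built while exploring. This is a generic sketch of the technique on a 2D grid, not Stone Aerospace's navigation code:

```python
from collections import deque

def path_home(free_cells, start, home):
    """BFS over the map built during exploration: find a route back to the
    start point that passes only through cells known to be free.
    free_cells maps (x, y) -> True for traversable cells."""
    queue = deque([(start, [start])])
    seen = {start}
    while queue:
        (x, y), path = queue.popleft()
        if (x, y) == home:
            return path                      # shortest route through known space
        for nxt in ((x+1, y), (x-1, y), (x, y+1), (x, y-1)):
            if free_cells.get(nxt) and nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [nxt]))
    return None                              # home unreachable on the known map
```

The key point is that the map is self-made: the planner only needs the cells the vehicle has already observed to be free, which is exactly what SLAM-style exploration provides.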
Throughout these presentations the UUV Track served to showcase the growing power of software in new un-crewed marine vehicles. There remains a great deal of room to discuss and define terms of art such as autonomy, artificial intelligence, machine learning, and SLAM. But regardless of the definitions, the new capabilities coming to all manner of ocean robots are impressive. New hardware and software ideas are driving innovation and shaping the future of untethered vehicles in the maritime industry.