Advancing underwater robotics

15 June 2010



A new project has been researching the use of advanced underwater robotics technology for dam inspections. Members of the research team from the University of Girona, Spain, give more details.


AIRSUB is a research project funded by the Spanish Ministry of Science and Technology with the aim of exploring the use of advanced underwater robotics technology for the visual inspection of hydroelectric dams. Two particular application scenarios have been studied: inspection of the bottom floor for zebra mussel habitat mapping, and visual inspection of the concrete of the dam wall.

In both cases the same technology is used. An ultra short base line (USBL) system connected to a differential global positioning system (DGPS) and a motion reference unit (MRU), mounted on a moored buoy (or alternatively on a support boat), is used for tracking and geo-referencing the robot position. The vehicle is equipped with a doppler velocity log (DVL) sensor and an MRU used for onboard navigation.

Two video cameras (one looking forward and one looking downward) are used for visual inspection, and a mechanically scanned imaging sonar helps the pilot when the robot is operated manually. All the data gathered onboard, as well as the data gathered on the buoy, is time-synchronised. At the end of the mission both the navigation data and the images are post-processed to build a geo-referenced photo-mosaic. The navigation data is exploited to provide a consistent, globally-aligned optical map, reducing as far as possible the drift that would accumulate if only optical imagery were used.

Results from two different scenarios are presented in this article. In the first, the robot was tele-operated to gather a geo-referenced bottom mosaic upstream of the Mequinenza dam on the Ebro river in Spain, used for habitat mapping of the zebra mussel. In the second, it operated autonomously to build a geo-referenced photo-mosaic of the wall of the Pasteral dam in Girona (on the Ter river in Spain).

Underwater robotics

The AIRSUB research project was devoted to the research and development of advanced underwater robotics and image processing techniques with applications to dam inspection. During the project, the research team met with the civil engineers of a Spanish power generation company, as well as with biologists of an environmental monitoring company. The aim was to define potential application scenarios of both industrial and environmental monitoring interest. The following cases were considered:

• Visual survey of the dam wall to assess the state of the concrete.

• Visual survey of the protective fence at the water inlet to the penstock gallery, to assess the amount of vegetation debris obstructing the water flow.

• Visual survey of the bottom for habitat mapping of the zebra mussel.

Currently, these inspections are commonly performed through careful review of video recorded by a professional diver, often without any localisation information. Sometimes a GPS reading gathered at the surface is overlaid on the image, giving an approximate location of the underwater camera. A diver tracking system may also be used, but even then the system does not have enough accuracy to stitch all the images together into a global map of the surveyed area.

In recent years, several companies have offered underwater robots for dam inspection. Normally they propose the use of small-class remotely operated vehicles (ROVs), working as tele-operated cameras for video recording, to replace the professional diver who traditionally carried out this task. Few research precedents exist that provide an added-value solution.

One of the most relevant works is the ROV3 system developed by researchers at the Institut de Recherche d'Hydro-Québec in Canada [1]. This is a small ROV, localised through a long base line (LBL) system, which makes use of multi-beam sonar for collision avoidance. The system is able to control the distance to the wall and includes several video cameras as well as a laser system for 2D and 3D measurements.

COMEX and Electricité De France developed a similar project [2]. In this case, an ROV manufactured by COMEX was localised using five LBL transponders. Again, several video cameras together with a 2D (double spot) laser system were used to take measurements.

During 2002, in collaboration with the Research Development and Technological Transfer Centre (CIFATT) IPA-Cluj, the research team used the URIS robot working as an ROV to build an image mosaic of a small area of the wall of the Tartina dam in Romania [3]. This was the first time that image mosaic techniques had been applied to dam inspection. This solution provides important added value since it gives civil engineers a global view of the inspected area.

Unfortunately, the ROV was not localised and hence the resulting image mosaic was not geo-referenced. The lack of this information makes it more difficult to perform periodic inspections of damaged spots and to locate the areas where repair works must take place. The major contribution of the work presented in this article consists of using an underwater robot and a USBL-based global localisation system to build geo-referenced photo-mosaics of selected areas of the wall or the bottom floor, opening the door to systematic visual inspections.

Hardware systems

During the AIRSUB project, different hardware systems were developed to allow the generation of geo-referenced visual maps. The first is an underwater vehicle capable of executing the survey mission and the second is an absolute positioning system capable of determining the vehicle's position in world coordinates. Each of these elements is described below.

The Ictineu vehicle

The Ictineu vehicle [4] was conceived around a typical open-frame design as a research prototype for validating new technologies. It is a small (0.8x0.5x0.5m) and light (60kg in air) vehicle. It is practically neutral (approx. 0.6kg of positive buoyancy), it is stable in roll and pitch due to its weight and volume distribution, and it can be controlled in surge, sway, heave and yaw with six thrusters. Four of them are placed horizontally in a rhombus configuration that makes it possible to thrust in any horizontal direction (surge and sway) and perform rotation (yaw) simultaneously. The other two thrusters are placed vertically and actuate the heave degree of freedom (DOF).
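
As an illustration of how four horizontal thrusters in a rhombus configuration can realise arbitrary surge, sway and yaw commands, the sketch below maps desired body-frame efforts to thruster forces through the pseudo-inverse of an allocation matrix. The 45-degree thruster angles, the lever arm and the even heave split are illustrative assumptions, not the Ictineu's actual geometry or control law.

```python
# Minimal thruster-allocation sketch for four horizontal thrusters in a
# rhombus layout plus two vertical thrusters. Geometry values are assumed.
import numpy as np

ANGLE = np.deg2rad(45.0)   # assumed orientation of each horizontal thruster
LEVER = 0.25               # assumed moment arm about the yaw axis (m)

# Columns: contribution of each horizontal thruster to (surge, sway, yaw moment)
B_HORIZONTAL = np.array([
    [ np.cos(ANGLE),  np.cos(ANGLE),  np.cos(ANGLE),  np.cos(ANGLE)],
    [ np.sin(ANGLE), -np.sin(ANGLE),  np.sin(ANGLE), -np.sin(ANGLE)],
    [ LEVER,          LEVER,         -LEVER,         -LEVER        ],
])

def allocate(surge, sway, yaw_moment, heave):
    """Return six thruster forces for the requested body-frame efforts."""
    horizontal = np.linalg.pinv(B_HORIZONTAL) @ np.array([surge, sway, yaw_moment])
    vertical = np.full(2, heave / 2.0)   # the two vertical thrusters share heave
    return np.concatenate([horizontal, vertical])

if __name__ == "__main__":
    # e.g. move forward while turning slightly and descending
    print(allocate(surge=10.0, sway=0.0, yaw_moment=1.0, heave=-5.0))
```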

The four DOFs allow scanning of the dam wall while maintaining the distance and point of view of the camera as the vehicle moves vertically and horizontally. The vehicle has two cylindrical pressure vessels which house the power and computer modules. The power module contains a pack of batteries, and the computer module has two industrial PCs, one for control and one for image and sonar processing, connected through a 100Mbps Ethernet switch.

An interesting characteristic of this vehicle is that it can operate either as an ROV (tethered mode) or as an AUV (untethered mode). An optional umbilical cable can be connected to the two modules to supply power and Ethernet communication to the vehicle. This mode of operation is very useful, not only to operate the Ictineu as an ROV, but also to monitor the software architecture while the vehicle performs the dam inspection autonomously.

When working in full AUV mode, the umbilical cable is removed and the vehicle relies on batteries to power all the systems and, therefore, has a limited running time but a longer range of operation. Communication can then be established using an acoustic modem, which is integrated in the USBL sensor. The sensors onboard the Ictineu AUV are listed in Table 1.

USBL absolute positioning system

The purpose of this surface equipment is to determine the absolute position in world coordinates of the USBL transponder mounted on the vehicle. This information is necessary for geo-referencing the sensor data acquired with the vehicle, as well as to reduce the drift that inherently affects the dead-reckoning navigation estimate.

This system is basically composed of a Linkquest Tracklink 1500 USBL transceiver and its supporting sensors, a DGPS and an Xsens MTi MRU, whose purpose is to compensate for the position and attitude changes of the transceiver during the vehicle position estimation process (see figure 2a). The different components are attached to an aluminium structure which, depending on the requirements of the mission, can be mounted outboard of a small boat (see figure 2b) or attached to a drifting buoy (see figure 2c).

The data logging is performed on an external computer connected to the sensors through RS232 which, if necessary, can be mounted on the buoy and powered with batteries. In order to integrate the sensor information acquired with the Ictineu with the position estimates from the USBL system, the data must share a common time base. For this reason, the computers in charge of the data logging are synchronised at the beginning of the mission.
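
As a rough illustration of what putting the two data streams on a common time base involves, the toy function below pairs each onboard sample with the nearest surface USBL fix after applying a clock offset measured at the start of the mission. The data structures and the nearest-neighbour pairing are assumptions for illustration, not the logging software used in the project.

```python
# Toy alignment of two time-stamped logs onto a common base. Each log is a
# list of (timestamp, sample) tuples; clock_offset is the surface-to-vehicle
# clock difference measured once at mission start (hypothetical interface).
import bisect

def align(vehicle_log, usbl_log, clock_offset):
    """Pair each vehicle sample with the closest USBL fix in corrected time."""
    usbl_times = [t + clock_offset for t, _ in usbl_log]
    paired = []
    for t, sample in vehicle_log:
        i = bisect.bisect_left(usbl_times, t)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(usbl_times)]
        if not candidates:
            continue                      # no USBL data at all
        j = min(candidates, key=lambda k: abs(usbl_times[k] - t))
        paired.append((t, sample, usbl_log[j][1]))
    return paired
```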

Software systems

In this section, several software systems designed and developed in our lab for controlling and geo-referencing the robot position, and building visual maps, are described.

Intelligent control architecture

Ictineu AUV is endowed with intelligent software responsible for the navigation, guidance and control of the vehicle. The mission is programmed using mission control language (MCL), an imperative language which compiles into a Petri-net-based representation of the mission [5], useful for pre-mission verification and real-time mission execution. During run time, the mission controller is in charge of the sequential and/or parallel execution of tasks.

The system includes tasks for checking safety alarms, logging sensor data, waypoint guidance, path following, enabling/disabling sensors, keeping a relative position with respect to the wall [6], altitude control, etc. The result of the mission execution is a set of log files containing all the data gathered by the sensors, including the imagery, conveniently synchronised. This data is then post-processed to build the maps.
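
The actual mission controller executes a Petri-net representation compiled from MCL and supports parallel tasks; the toy sequencer below only conveys the general idea of running tasks in order while checking a safety condition between steps, and every name in it is hypothetical.

```python
# Hypothetical, deliberately simplified mission sequencer: run tasks in order,
# aborting if any task fails or if the safety check reports a problem. The
# real MCL-based controller additionally handles parallel task execution.
def run_mission(tasks, safety_ok, abort_task):
    """tasks: callables returning True on success; safety_ok: callable; abort_task: e.g. surface."""
    for task in tasks:
        if not safety_ok():
            abort_task()
            return False
        if not task():
            abort_task()
            return False
    return True
```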

Geo-referencing the robot position

During a mission, the vehicle estimates the trajectory by means of a stochastic sensor fusion algorithm known as the extended Kalman filter [7]. The filter estimates the vehicle state (position, heading and velocity) and its corresponding uncertainty following a two-step recursive process. First, a model is used to predict the vehicle motion. Then, this predicted state is corrected by introducing new information provided by the sensors.

In our particular application, the velocity measurements from the DVL, the depth calculated from the pressure sensor and the heading provided by the fibre optic gyro are used to perform the state update. As a result of this process, a dead-reckoning estimation of the vehicle motion is obtained. The USBL system provides absolute measurements required to geo-reference the vehicle as well as to bind the error growth in the estimated position.
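
For readers unfamiliar with the filter, the fragment below is a deliberately simplified, planar sketch of the prediction/correction cycle described above: the state holds position, heading and body-frame velocities, the DVL corrects the velocities and the gyro corrects the heading. The state layout, measurement models and noise handling are illustrative assumptions, not the filter actually running on the Ictineu.

```python
# Simplified planar EKF sketch of dead-reckoning navigation.
# State x = [x, y, heading, surge velocity, sway velocity].
import numpy as np

def predict(x, P, dt, Q):
    """Constant-velocity prediction: integrate body velocities in the world frame."""
    px, py, psi, u, v = x
    c, s = np.cos(psi), np.sin(psi)
    x_pred = np.array([px + dt * (c * u - s * v),
                       py + dt * (s * u + c * v),
                       psi, u, v])
    F = np.eye(5)                          # Jacobian of the motion model
    F[0, 2] = dt * (-s * u - c * v); F[0, 3] = dt * c;  F[0, 4] = -dt * s
    F[1, 2] = dt * ( c * u - s * v); F[1, 3] = dt * s;  F[1, 4] =  dt * c
    return x_pred, F @ P @ F.T + Q

def update(x, P, z, H, R):
    """Standard correction step with a linear measurement model z = H x + noise."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_new = x + K @ (z - H @ x)
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Example measurement models (rows select state components):
H_DVL = np.array([[0, 0, 0, 1, 0],     # surge velocity
                  [0, 0, 0, 0, 1]])    # sway velocity
H_GYRO = np.array([[0, 0, 1, 0, 0]])   # heading
```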

To operate, the USBL system makes use of a second Kalman filter that constantly estimates the position and attitude of the transducer from the measurements provided by the MRU and the DGPS. When a USBL measurement is obtained (vehicle position referenced to the sensor frame) it is composed with the current filter estimate (absolute position and attitude of the sensor) to produce the vehicle position referenced to the world frame. This global position is then fused offline with the dead-reckoning position estimate to compute the geo-referenced robot trajectory used to assist the mosaic-building process.
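
The composition step can be illustrated with a short sketch: a USBL fix expressed in the transceiver frame is rotated by the transceiver attitude estimated from the MRU and added to its DGPS-derived position, giving the vehicle position in world coordinates. For brevity the example assumes a yaw-only attitude, whereas the real system compensates full attitude.

```python
# Compose a transceiver-frame USBL fix with the transceiver pose to obtain
# the vehicle position in world coordinates (yaw-only rotation for brevity).
import numpy as np

def compose(transceiver_pos_world, transceiver_yaw, fix_in_transceiver_frame):
    c, s = np.cos(transceiver_yaw), np.sin(transceiver_yaw)
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])
    return transceiver_pos_world + R @ fix_in_transceiver_frame
```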

Photo-mosaics

Photo-mosaics are built offline by stitching together all the images gathered during the survey, making use of the navigation data (robot pose). The process is done automatically using software developed by our team. First, the algorithm detects common features in each pair of sequential overlapping images. This allows the camera motion to be computed (up to scale), so that the images can be aligned to form a global visual map (the photo-mosaic).

It is worth noting that, because this is an iterative process, small registration errors accumulate, causing drift. Whenever the robot re-visits an already mapped area (a crossover), a loop is generated. This allows non-consecutive image registrations to be used for globally aligning the photo-mosaic through a technique known as bundle adjustment. Finally, the mosaic alignment is improved through several iterations of crossover detection and optimisation. The result is a globally aligned photo-mosaic.
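
As an indication of what the pairwise registration step involves, the sketch below matches features between two consecutive frames and estimates a homography with RANSAC using OpenCV. It covers only the sequential registration described above; the use of navigation data, crossover detection and bundle adjustment of the actual pipeline are not shown, and the detector choice and thresholds are assumptions.

```python
# Pairwise image registration sketch: match features between two consecutive
# frames and estimate the homography that maps one into the other.
import cv2
import numpy as np

def register_pair(img_a, img_b, min_matches=10):
    orb = cv2.ORB_create(nfeatures=2000)
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)
    if des_a is None or des_b is None:
        return None                              # not enough texture in a frame
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)
    if len(matches) < min_matches:
        return None
    pts_a = np.float32([kp_a[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    pts_b = np.float32([kp_b[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(pts_a, pts_b, cv2.RANSAC, 3.0)
    return H   # maps pixels of img_a into the frame of img_b
```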

Experimental results

In July 2008 several experiments were carried out upstream of the Mequinenza dam on the Ebro river in Spain. This was part of a collaboration project with Ecohydros, a Spanish company that offers aquatic ecosystems consulting services. The purpose of the experiments was to produce several geo-referenced photo-mosaics of the bottom. This was to provide visual validation for a sonar-based system being developed by the company to detect zebra mussel colonies.

During the test trials, the USBL system was mounted outboard of a boat and the Ictineu vehicle performed small surveys in some areas of interest. The water turbidity forced navigation close to the bottom to ensure sufficient image quality. This made it difficult to control the vehicle and reduced the performance of the DVL measurements (our device is unable to provide reliable velocity estimates below 1m altitude). Moreover, the harsh environment, with steep slopes and large rocks, made autonomous operation of the vehicle impossible. For this reason, the Ictineu was set to ROV mode and operated from the boat with the help of real-time feedback from the imaging sonar, the vehicle cameras and the position estimate from the USBL.

The results of the different dives were irregular. The unreliable DVL data and the multipath affecting the USBL measurements made it difficult to obtain a good position estimate. The quality of the captured images was also irregular. Manoeuvres were necessary to avoid hitting the large rocks present in the scene and, as a consequence, the camera suffered abrupt changes in altitude, sometimes losing sight of the bottom because of the turbid water. However, despite these issues, it was still possible to generate some photo-mosaics of sufficient quality.

Experiments at the Pasteral Dam

The purpose of this test was to demonstrate the capacity of the system to inspect a dam wall in search of cracks or other damage to the concrete. Nowadays, professional divers perform this task, but they offer only a limited view of the submerged structure, which makes it difficult to clearly determine the position of defects.

The proposed method overcomes this problem by generating a geo-referenced mosaic from which a technician can easily determine the exact location of the spot of interest. Moreover, the fact that the underwater vehicle can execute this task autonomously makes it possible to increase the frequency of the inspections.

During the tests carried out in February 2009 at the Pasteral dam (Ter river, Spain), the Ictineu autonomously executed a survey trajectory covering a rectangular area of approximately 4x10m with the forward-looking camera. The USBL system was placed on a buoy anchored in front of the dam with the transducer oriented towards the wall. With this same setup it is possible to cover larger areas. In fact, the same image processing technology has been successfully applied to build a photo-mosaic of 20,000 images of the Mid-Atlantic Ridge [8], in that case gathered from an oceanographic ROV.

However, during the trials the hydroelectric dam was operational and the test area was limited for safety reasons. In the resulting photo-mosaic it is possible to observe the concrete texture as well as the algae adhering to it. Six reflective elements were placed at known distances as a means of validating that the resulting mosaic dimensions were accurate.

It is our belief that performing systematic geo-referenced visual inspections provides important added value with respect to conventional methods of dam inspection.

The authors are P. Ridao, M. Carreras, R. Garcia, D. Ribas and J. Batlle at the Institute of Informatics and Applications, University of Girona, Spain.

Email:{pere, marcc, rafa, dribas, jbatlle}@eia.udg.edu

Acknowledgements: This research was sponsored by the Spanish government under the grants DPI2005-09001-C03-01 (Autonomous Robot for Dam Inspection) and PROFIT 010/SGTB/2007/1.1 (Hydraulic and Remote Sensing Techniques for Zebra Mussel Management), in an activity executed by Ecohydros under the coordination of the project leader DMADS ENDESA. We would also like to thank the Spanish company ENDESA Generación for their help with the experiments at the Pasteral dam.



