Robotic Choreography

Originally Posted on July 19th, 2010 by Alex Forrest


This blog is courtesy of Pavilion Lake Research Project (PLRP)

For more information please visit


DORA and UBC-Gavia in the water ready to deploy in Pavilion Lake.

It's now been just over a week since the end of our adventures at Pavilion Lake and, as I start to look at all the data we've collected, I can't help but be impressed with our successes. In addition to the image mosaicking that I was working on, and showed pictures of in an earlier post, my specific focus up at the lake was running coordinated missions between the two autonomous underwater vehicles (AUVs) that we had on-site from the University of British Columbia and the University of Delaware, and the Deepworker vehicles. Our mission planning goals were twofold: joint objectives and joint missions.

Joint objective style missions measure parameters that are relatively static in time (e.g. photos of microbialites). This means that coordinating the different platforms isn't necessary, but coordinating their datasets is. This requires that the timestamps of each data stream be precisely set and that each dataset be georeferenced to a high degree of accuracy. This work was started last year and continued this year by taking the images collected from Deepworker and comparing them with AUV-collected data (e.g. high-precision bathymetry).
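As a rough illustration of what "coordinating the datasets" means in practice, the sketch below aligns two hypothetical timestamped data streams: once both clocks are synchronized, the AUV's bathymetry record can be interpolated at each Deepworker photo timestamp so every image is paired with the depth beneath it. All of the numbers and variable names here are invented for illustration, not values from our actual missions.

```python
import numpy as np

# Hypothetical AUV bathymetry samples: (seconds since mission start, depth in m)
auv_t = np.array([0.0, 10.0, 20.0, 30.0])
auv_depth = np.array([52.1, 53.4, 54.0, 53.2])

# Hypothetical Deepworker photo timestamps, after clock-offset correction
photo_t = np.array([4.0, 18.0, 26.0])

# Interpolate the AUV depth record at each photo timestamp so that
# every image can be paired with the bathymetry measured beneath it.
photo_depth = np.interp(photo_t, auv_t, auv_depth)
```

The real pipeline also has to georeference both streams into a common coordinate frame, but the time-alignment step above is the part that fails silently if either platform's clock is off.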

Comparing multibeam bathymetry collected with DORA with detailed imagery from UBC-Gavia.

Joint missions involved a significantly greater degree of coordination, as they required running the AUVs at the same time as the Deepworkers. Our experiment this year was to look at the area of increased salinity at the bottom of the lake. To this end we had the Deepworkers crossing the bottom of the basin at about 1 m off the bottom (> 55 m depth), while running UBC-Gavia at 40 m depth. The greatest debate was deciding on the minimum safe distance between the two platforms! In the end we ran AUV missions down to 48 m without any problems. Although we're only just starting to process all of this data, from both styles of missions, we're excited about what new perspectives these combined datasets might hold.
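The deconfliction logic we debated boils down to a vertical-separation constraint. The sketch below encodes it for the depths mentioned above; the 7 m margin is purely an assumed value for illustration, not the figure we actually settled on.

```python
# Deepworker working ~1 m off the bottom, at roughly 55 m depth
DEEPWORKER_DEPTH_M = 55.0
# Assumed minimum vertical separation (illustrative, not the real margin)
MIN_SEPARATION_M = 7.0

def auv_depth_is_safe(auv_depth_m: float) -> bool:
    """True if the AUV stays at least MIN_SEPARATION_M above the Deepworker."""
    return DEEPWORKER_DEPTH_M - auv_depth_m >= MIN_SEPARATION_M
```

Under this assumed margin, the 40 m mission depth is comfortably safe, and the 48 m missions would sit right at the limit.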