October 19, 2013
The Ocean Through The Eyes Of A Robot
Some of the greatest mysteries of our planet can be found just below the waves. Much about the oceans remains unknown simply because we are land-based animals; the sea is foreign to us, however much we rely on it for survival. Our coral reefs hold many of these mysteries: we have never been able to map them accurately enough to learn just how expansive they are, or what they may hide within. Most of our modern tools cannot do the job reliably. Satellite images are too distorted by the water, sonar works poorly in the shallow waters where reefs grow, and radar cannot penetrate the water's surface at all. Hand-drawn maps, basic underwater photography, and quadrant-by-quadrant surveys work, but they take time, more time than many researchers feel they have left. Given the reefs' vulnerability to pollution, and the ever-increasing amount of it that makes its way into our oceans each year, researchers fear they may be running out of time to study these reefs, even as they desperately search for a way to save them.
And that is where robots come in.
Developed by Stanford aeronautics graduate student Ved Chirayath, a four-rotor remote-controlled flying robot outfitted with cameras records images of the reefs from up to 200 feet in the air. Those aerial images are combined with footage from a number of 360-degree underwater cameras and processed by software called Fluid Lensing, which Chirayath designed during his work at NASA's Ames Research Center (link not available due to government shutdown). The software removes the distortion caused by wave movement and enhances the resolution of the captured images, allowing the shots from the flying drone and the underwater cameras to be compiled into detailed, centimeter-scale aerial maps and gigapixel panoramic photographs that give researchers a closer look at coral reefs than they have ever had before.
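The article does not spell out how Fluid Lensing works internally, but one classic idea in this family of techniques is "lucky imaging": among many frames of a scene seen through a shifting medium, a few are captured at moments when the distortion is minimal, and keeping only the sharpest frames beats naively averaging everything. The sketch below is a toy illustration of that principle only, not NASA's or Chirayath's actual algorithm; the checkerboard "reef", the `blur` model of wave smearing, and the `sharpness` score are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical "reef texture": a checkerboard stands in for seafloor detail.
truth = (np.indices((32, 32)).sum(axis=0) % 2).astype(float)

def blur(img, passes):
    """Crude box blur standing in for the smearing a wavy water
    surface adds to any single frame."""
    out = img.copy()
    for _ in range(passes):
        out = (out + np.roll(out, 1, 0) + np.roll(out, -1, 0)
               + np.roll(out, 1, 1) + np.roll(out, -1, 1)) / 5.0
    return out

def sharpness(img):
    """Mean squared gradient: crisper frames score higher."""
    gy, gx = np.gradient(img)
    return float(np.mean(gy**2 + gx**2))

# Simulate 100 frames; the random "wave state" occasionally yields a
# near-undistorted view (passes == 0), a so-called lucky frame.
frames = [blur(truth, int(p)) for p in rng.integers(0, 6, size=100)]

# Keep the five sharpest frames and average them.
best = np.argsort([sharpness(f) for f in frames])[-5:]
recovered = np.mean([frames[i] for i in best], axis=0)

naive = np.mean(frames, axis=0)  # averaging everything keeps the smear
print(np.abs(recovered - truth).mean(), np.abs(naive - truth).mean())
```

The printed errors show the lucky-frame composite landing much closer to the true pattern than the naive average, which is the intuition behind seeing "through" a moving water surface by collecting many frames and trusting only the clearest moments.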
Chirayath originally developed Fluid Lensing to record accurate images of vegetation and ocean flows from satellites, and to gather detailed information on targets out in space. Only when he heard about the difficulties of mapping coral reefs did he turn his attention, and his software, toward that end. Chirayath says he was “inspired by the way the human eye works in conjunction with the brain to try to resolve an obscured image.” “It’s an ability to rapidly assimilate a vast amount of data and, in effect, see through strong optical distortions.” In effect, the eyes of his flying drone work much the same way our own do.
How will this new technology help us save what remains of our fragile coral reefs? Only time will tell.
Image Credit: Dan Griffin / Stanford University