We’ve sent people to the Moon and we’ve put rovers on Mars (most people believe this unless they’re a conspiracy nut or a flat-Earther), but the oceans on our very own planet are still very much unexplored. And photographers are missing out on the beauty and biological diversity that await us in the Seven Seas because all that water gets in the way and distorts the colors of everything—from seahorses to corals.
Well, oceanographer Derya Akkaynak and engineer Tali Treibitz have created an algorithm called ‘Sea-thru’ (I’ll bet my right hand that this pun is definitely intended) that removes the water from underwater photographs. That way, you get all the hues and saturation that you’d expect from regular, professional landscape pics. Scroll down for Bored Panda's interviews about underwater photography with Akkaynak, as well as with Alastair Bailey, a divemaster with around 500 dives under his belt.
Here is what Akkaynak had to say in her interview with Bored Panda: “Underwater images have not benefited from the full potential of powerful computer vision and machine learning algorithms, which have enabled great discoveries (and fun tools) for photos taken on land. Many marine scientists use photography as a way to study the ocean ecosystems and collect vast amounts of images and video, but only a fraction of those ever get analyzed because for the most part, they have to do it manually.”
“For example, according to a NOAA report, a human expert spends 2 hours identifying and counting fish in a video that is one hour long. That is tedious and expensive—not to mention slow, at a time when scientists are racing against time to understand and monitor the dramatic changes our oceans are undergoing.”
She continued: “We set out to help speed up the pace of marine research, by providing a fast and consistent way to recover lost colors in underwater images, so powerful AI methods could be used.”
“Sea-thru is the product of 3.5 years of theoretical and experimental research. Everything that can go wrong for a scientific project has gone wrong, and more. There were many failed underwater experiments, which took a long time to even understand what exactly was failing. In the first year of it, I got hit by a car and broke my hand two weeks before a very important deadline. I still did all the planned dives by the schedule—with one operational hand,” Akkaynak said, revealing just how dedicated she was to the project.
She also explained where the research will be taken next. “The first and immediate step is to release a version that will be freely available for non-commercial uses. Both scientists and recreational photographers have made it clear to us that they want their own Sea-thru copy.”
“In terms of where to take the research next, we would like to adapt the algorithm for artificial light (currently it is designed to work with natural light only). Even in the clearest waters, natural light penetrates just a few dozen meters, so we use artificial lights to explore the vast majority of our oceans. And in terms of applications, we definitely see it as a filter in Photoshop (or other software) for post-processing of photos, but I imagine it’s only a matter of time (and funds) until we see real-time implementations on cameras or phones or even scuba masks!”
Akkaynak also drew attention to this fact: “I would like to emphasize that Sea-thru does not require a color chart to work. That point was not clear from the Scientific American video and created a lot of confusion.”
What you need to know about water, besides that it’s wet and covers over two-thirds of the planet’s surface, is that it absorbs and scatters light differently depending on wavelength, with red light disappearing first. This is why underwater coral reef photographers end up with washed-out images missing the vibrant yellows and reds that you’d otherwise expect to see.
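The wavelength-dependent loss can be sketched with a simple Beer–Lambert-style decay model. To be clear, this is a rough illustration with invented attenuation coefficients, not the actual Sea-thru algorithm:

```python
import math

# Illustrative attenuation coefficients per meter (invented values);
# red light is absorbed far faster than green or blue in seawater.
ATTENUATION = {"red": 0.40, "green": 0.07, "blue": 0.03}

def remaining_light(channel: str, depth_m: float) -> float:
    """Fraction of surface light left at a given depth (exponential decay)."""
    return math.exp(-ATTENUATION[channel] * depth_m)

# How much of each color survives at a few depths.
for depth in (1, 5, 10):
    fractions = {c: round(remaining_light(c, depth), 3) for c in ATTENUATION}
    print(f"{depth} m: {fractions}")
```

Run this and you'll see why a reef at 10 meters looks blue-green: only a few percent of the red light survives the trip, while most of the blue does.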
Bored Panda also reached out to DSD Divemaster Alastair Bailey who has done around 500 dives while working at Byron Bay Dive Centre. In an interview, he explained that underwater “you get light refraction making objects appear closer and larger, however after a couple of dives you start to account for this naturally.”
“The distance you can see is also affected by the water and any particles suspended in the water. Then, as you go deeper, light gets absorbed; this starts with red.”
Bailey explained that the biggest challenges underwater photographers face are the lack of visibility and the absorption of light. “These work to reduce the clarity and contrast in photos. While you do get red filters, these generally work only between certain depths; the second option is using high-powered flashbulbs.”
Akkaynak and Treibitz’s algorithm has far-reaching consequences for marine biologists who rely on accurate colors to count and classify species: now they’ll get a well-deserved break because machines will be able to do their time-consuming work for them.
While some people are incredibly excited by the scientists’ algorithm, others are less than enthusiastic. Some critics believe that removing the water from underwater photos makes them look bland, while others argued that the algorithm is far from revolutionary because it involves ‘basic color correction.’
However, the ‘Sea-thru’ algorithm is very different from Photoshopping a photo to enhance the colors that water absorbs. The algorithm creates a “physically accurate correction,” which means that what you see is what you’d get if underwater plants and animals were on the surface.
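The difference can be sketched in a few lines: a Photoshop-style white balance multiplies every pixel by one constant gain per channel, while a physics-based correction inverts the attenuation as a function of each pixel's distance through the water. The coefficients and distances below are invented for illustration and this is a toy sketch, not Sea-thru's actual model:

```python
import math

BETA = {"red": 0.40, "green": 0.07, "blue": 0.03}  # invented attenuation per meter

def global_white_balance(value: float, gain: float) -> float:
    """Naive fix: one constant gain per channel, regardless of distance."""
    return min(1.0, value * gain)

def range_based_correction(value: float, channel: str, range_m: float) -> float:
    """Physics-style fix: undo exp(-beta * z) using each pixel's range z."""
    return min(1.0, value * math.exp(BETA[channel] * range_m))

# Two identical red surfaces, one 2 m away and one 8 m away, record
# very different red values underwater...
near = math.exp(-BETA["red"] * 2)  # most of the true red remains
far = math.exp(-BETA["red"] * 8)   # almost none remains

# ...so a single gain tuned for the near surface leaves the far one dark,
# while the range-based correction restores both to their true value (1.0).
gain = 1.0 / near
print("global: ", global_white_balance(near, gain), global_white_balance(far, gain))
print("ranged: ", range_based_correction(near, "red", 2),
      range_based_correction(far, "red", 8))
```

That per-pixel, per-distance inversion is what "physically accurate" means here: the same object gets the same corrected color no matter how much water sat between it and the lens.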
But what do you think of the underwater algorithm, dear Pandas? Which photos do you prefer: the pics with the water or those with all the water removed? Let us know in the comments.