Researchers at the University of Barcelona have developed an open-access web app, based on deep learning techniques, which enables the detection and quantification of floating marine litter from aerial photography.
The app – ‘MARLIT’ – detects and quantifies floating plastic with a reliability of 80 per cent, using an algorithm designed with deep learning techniques.
Historically, direct observations from boats or planes have been the basis for assessing the impact of floating marine litter. However, the sheer extent of the ocean surface makes it challenging for researchers to advance with monitoring studies.
“Automatic aerial photography techniques combined with analytical algorithms are more efficient protocols for the control and study of this kind of pollutant,” said Odei Garcia-Garin, PhD candidate and first author of the Environmental Pollution study. “However, automated remote sensing of these materials is at an early stage.
“There are several factors in the ocean [waves, wind, and clouds] that hinder the automatic detection of floating litter in aerial images of the marine surface. This is why only a few studies have made the effort to develop algorithms for this new research context.”
The researchers designed a new algorithm to automate the quantification of floating plastic from aerial photography, using deep learning techniques and more than 38,000 aerial images of the Mediterranean coast in Catalonia. Deep learning is an approach to machine learning often used in computer vision applications.
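The study does not describe the network's architecture, but the core operation behind any such detector – sliding learned filters over an image and classifying the response – can be illustrated with a toy NumPy sketch. This is not MARLIT's actual model; the hand-crafted kernel stands in for the feature detectors a real network would learn from the 38,000 training images.

```python
import numpy as np

def convolve2d(image, kernel):
    """Valid-mode 2-D correlation of a grayscale image with a filter kernel."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def litter_score(image, kernel):
    """Max rectified filter response: a crude stand-in for a learned CNN feature."""
    response = np.maximum(convolve2d(image, kernel), 0)  # ReLU
    return response.max()

# Toy "aerial image": dark water (0.1) with one bright floating object (0.9).
water = np.full((8, 8), 0.1)
litter = water.copy()
litter[3:5, 3:5] = 0.9

# A centre-surround kernel that responds to bright blobs on a dark background.
kernel = np.array([[-1, -1, -1],
                   [-1,  8, -1],
                   [-1, -1, -1]], dtype=float)

print(litter_score(water, kernel))   # near zero: uniform water, no response
print(litter_score(litter, kernel))  # large positive: object present
```

A trained network replaces the fixed kernel with many layers of learned filters, but the decision it makes per image – litter present or absent – has the same shape.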
The algorithm was tested using images of the marine surface collected by drones and planes and reached 80 per cent accuracy.
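The reported figure is simply the proportion of test images classified correctly. As a minimal illustration (the labels below are invented, not the study's data):

```python
def accuracy(predictions, labels):
    """Fraction of test images whose predicted class matches the true label."""
    correct = sum(p == y for p, y in zip(predictions, labels))
    return correct / len(labels)

# Hypothetical test set: 1 = litter present, 0 = absent.
labels      = [1, 0, 1, 1, 0, 1, 0, 1, 0, 1]
predictions = [1, 0, 0, 1, 0, 1, 0, 1, 1, 1]
print(accuracy(predictions, labels))  # 0.8 — two of ten images misclassified
```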
The algorithm has been incorporated into MARLIT, an open-access web app. MARLIT enables the analysis of individual images, as well as of segments of images, identifying the presence of floating litter and estimating its density using image metadata. In future, it could be incorporated into a remote sensor such as a drone to fully automate the process.
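The article does not give MARLIT's density calculation, but a common approach – and one plausible reading of "using image metadata" – is to derive the sea-surface footprint of each image from flight altitude and camera geometry, then divide the item count by that area. The function and parameter names below are illustrative assumptions, not MARLIT's API.

```python
def ground_footprint_km2(altitude_m, focal_length_mm, sensor_w_mm, sensor_h_mm):
    """Sea-surface area covered by one nadir (straight-down) image.

    Uses the pinhole-camera relation:
    ground width = altitude * sensor width / focal length.
    """
    ground_w_m = altitude_m * sensor_w_mm / focal_length_mm
    ground_h_m = altitude_m * sensor_h_mm / focal_length_mm
    return (ground_w_m * ground_h_m) / 1e6  # m^2 -> km^2

def litter_density(item_count, altitude_m, focal_length_mm,
                   sensor_w_mm, sensor_h_mm):
    """Floating-litter density in items per square kilometre."""
    area = ground_footprint_km2(altitude_m, focal_length_mm,
                                sensor_w_mm, sensor_h_mm)
    return item_count / area

# Hypothetical flight: 12 items detected from 100 m altitude with a
# 10 mm lens on a 10 x 10 mm sensor (values chosen for round numbers).
print(litter_density(12, 100, 10, 10, 10))  # 1200.0 items per km^2
```

Per-image densities computed this way can then be aggregated across a survey transect, which is what makes a fully automated drone-mounted pipeline plausible.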