Deep learning and traditional computer vision techniques have transformed automated animal tracking, particularly in neuroscience, medicine, and biomechanics. A recent development by a UK-based research team further demonstrates this, introducing a hybrid method for precisely tracking fish movement in complex environments. The method combines deep learning with classical vision techniques, making it robust to changes in the fish’s appearance and to occlusion by obstacles.
Combining deep learning’s adaptability with the precision of classical centroid tracking, the proposed technique uses adaptive object detection to track the Picasso triggerfish accurately across varying backgrounds, occlusions, and body deformations. Working from footage recorded with a GoPro Hero 5 camera, the pipeline pairs an EfficientDet detector with optical flow, and applies Voronoi cell methods to identify the gates and areas between obstacles through which the fish moves.
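To make the hybrid idea concrete, here is a minimal Python/OpenCV sketch of a detect-then-track loop of the kind described above. It is not the team’s released code: the `detect_fish` helper is a hypothetical placeholder for an EfficientDet-style detector, and falling back to Lucas-Kanade optical flow when detection fails is one plausible way to realize the robustness to occlusion and deformation that the method claims.

```python
# Minimal sketch of a detect-then-track loop (assumptions, not the authors' code).
import cv2
import numpy as np

def detect_fish(frame):
    """Hypothetical wrapper around an EfficientDet-style detector.
    Should return a bounding box (x, y, w, h) or None if no fish is found."""
    raise NotImplementedError("plug in your detector here")

def track_video(path):
    cap = cv2.VideoCapture(path)
    lk_params = dict(winSize=(21, 21), maxLevel=3,
                     criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))
    prev_gray, centroid = None, None
    trajectory = []

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        box = detect_fish(frame)
        if box is not None:
            # Detection succeeded: take the box centre as the new centroid.
            x, y, w, h = box
            centroid = np.array([[x + w / 2.0, y + h / 2.0]], dtype=np.float32)
        elif prev_gray is not None and centroid is not None:
            # Detection failed (e.g. occlusion): propagate the last centroid
            # with sparse Lucas-Kanade optical flow instead.
            new_pt, status, _ = cv2.calcOpticalFlowPyrLK(
                prev_gray, gray, centroid.reshape(-1, 1, 2), None, **lk_params)
            if status.ravel()[0] == 1:
                centroid = new_pt.reshape(-1, 2)

        if centroid is not None:
            trajectory.append(tuple(centroid[0]))
        prev_gray = gray

    cap.release()
    return trajectory
```

In practice the released software is the reference; this sketch only illustrates how a detector and a classical tracker can hand off to each other frame by frame.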
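The Voronoi cell idea for finding gates between obstacles can also be illustrated. The sketch below is an assumption about how such an analysis might look, not the authors’ procedure: obstacles are approximated as circles with made-up centres and radii, and each Voronoi ridge between two neighbouring obstacles is read as a candidate gate whose width is the centre-to-centre distance minus both radii.

```python
# Hedged illustration of gate-finding with Voronoi cells (example values only).
import numpy as np
from scipy.spatial import Voronoi

def find_gates(centres, radii):
    """Each Voronoi ridge separates two neighbouring obstacles; the midpoint
    between their centres approximates the gate location, and the gap between
    their circular boundaries approximates the gate width."""
    vor = Voronoi(centres)
    gates = []
    for (i, j) in vor.ridge_points:  # pairs of neighbouring obstacles
        ci, cj = np.asarray(centres[i]), np.asarray(centres[j])
        width = np.linalg.norm(cj - ci) - radii[i] - radii[j]
        midpoint = (ci + cj) / 2.0
        gates.append({"between": (int(i), int(j)), "midpoint": midpoint, "width": width})
    return gates

# Example: four obstacles in a roughly 1 m x 1 m tank (made-up coordinates, metres).
centres = [(0.25, 0.24), (0.78, 0.22), (0.22, 0.74), (0.76, 0.78)]
radii = [0.08, 0.08, 0.08, 0.08]
for g in find_gates(centres, radii):
    print(f"gate between obstacles {g['between']}: "
          f"midpoint {g['midpoint']}, width {g['width']:.2f} m")
```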
The team’s paper describes the method for analyzing Picasso triggerfish behavior from video recorded in controlled tank settings, reporting 97% agreement between automatically computed and manually annotated fish trajectories. To encourage adaptation and further development, the researchers released their software, dataset, and tutorial under a Creative Commons license, making them available to the broader scientific community.
This fusion of deep learning and traditional computer vision is a significant step toward more accurate animal tracking, particularly for fish in complex experimental setups. While the results are impressive, challenges remain, and further refinement will be needed before the approach can be applied broadly beyond controlled settings.
In conclusion, this software for precise tracking of fish movement in complex environments represents a notable advance in automated animal tracking, and the released resources give the field a solid foundation for further adaptation and development.