Monitoring and analysis of wildlife are key to conservation planning and conflict management. The widespread use of camera traps coupled with AI-based analysis tools is an excellent example of the successful, non-invasive use of technology for the design, planning, and evaluation of conservation policies. In contrast to the typical use of camera traps, which capture still images or short videos, in this project we propose to analyze long-term videos monitoring a large flock of birds. This project, part of the NSF-TIH Indo-US joint R&D partnership, focuses on solving the challenges associated with analyzing long-term videos captured at feeding grounds, nesting sites, and other locations that host large flocks of migratory birds. We foresee that the project's objectives will lead to datasets and benchmarking tools, as well as novel algorithms, that will be instrumental in developing automated video analysis tools, which could in turn help us understand the individual and social behavior of birds. The first key outcome of this research is the curation of challenging, real-world datasets for benchmarking image and video analytics algorithms on tasks such as counting, detection, segmentation, and tracking. Our recent effort towards this outcome is a curated dataset of 812 high-resolution (4K to 32 MP), point-annotated images of a flock of Demoiselle cranes (Anthropoides virgo) taken at their feeding site in Khichan, Rajasthan, India. The average number of birds per image is about 207, with a maximum count of 1,500. Benchmark experiments show that state-of-the-art vision techniques struggle with segmentation, detection, localization, and density estimation on the proposed dataset. Over the course of this open-science research, we will scale this dataset for segmentation and tracking in videos, and develop novel video analytics techniques for wildlife monitoring.
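To make the point-annotation setup concrete, the sketch below shows one common way such annotations are used for counting and density estimation benchmarks: each annotated bird center is placed as a unit impulse and blurred with a Gaussian kernel, so the resulting density map sums to the bird count. This is a minimal illustration of the standard technique, not the project's released tooling; the function name and the coordinate list are hypothetical placeholders.

import numpy as np
from scipy.ndimage import gaussian_filter

def points_to_density_map(points, height, width, sigma=8.0):
    """Convert (x, y) point annotations into a Gaussian density map.

    A unit impulse is placed at each annotated bird center and smoothed
    with a Gaussian kernel; the map sums to (approximately) the count.
    """
    density = np.zeros((height, width), dtype=np.float32)
    for x, y in points:
        xi, yi = int(round(x)), int(round(y))
        if 0 <= yi < height and 0 <= xi < width:
            density[yi, xi] += 1.0
    return gaussian_filter(density, sigma=sigma)

if __name__ == "__main__":
    # Hypothetical example: three annotated birds in a 4K frame.
    pts = [(120.5, 340.0), (1900.2, 1080.7), (3500.0, 2000.3)]
    dmap = points_to_density_map(pts, height=2160, width=3840)
    print(f"Count recovered from density map: {dmap.sum():.2f}")  # ~3

A density-estimation model trained against such maps can then be evaluated by comparing its predicted sums with the ground-truth counts (e.g., mean absolute error), which is the kind of counting benchmark described above.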
@inproceedings{kshitizlong,
  title     = {Long-term Monitoring of Bird Flocks in the Wild},
  author    = {Kshitiz and Shreshtha, Sonu and Mounir, Ramy and Vatsa, Mayank and Singh, Richa and Anand, Saket and Sarkar, Sudeep and Parihar, Sevaram Mali},
  booktitle = {International Joint Conference on Artificial Intelligence},
  pages     = {6344--6352},
  year      = {2023}
}
We thank Hemang Dahiya, Arsh Gupta, Neelabh Kumar Srivastava (IIIT-Delhi), and Ahmed Shahabaz (USF) for their assistance with point annotation for the proposed bird dataset, and the Bombay Natural History Society for the discussions. This research was supported by the US NSF grant IIS 1956050 and iHub Drishti, the TIH on CV, AR and VR. Saket Anand was partly supported by the Infosys Center for AI, IIIT-Delhi.