IA4Birds incorporates an AI-based bird visual detection system
In parallel with advances in the AI-based audio classification model, researchers from the IA4Birds project have developed two complementary visual tracking algorithms using the advanced AXIS Q6225-LE PTZ (Pan-Tilt-Zoom) camera.
The first algorithm integrates custom visual detection models based on YOLO (You Only Look Once) technology for real-time identification and continuous tracking of individual birds in flight. The system detects birds, calculates their spatial position in each frame, and automatically adjusts the camera’s orientation to keep the selected bird centred in view. To prevent the camera from constantly switching between different detected birds, the algorithm applies a maximum distance threshold between each new detection and the previous one, ensuring stable and uninterrupted tracking.
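The tracking logic described above can be sketched in a few lines. This is an illustrative simplification, not the project's actual code: the threshold value, function names, and frame size are assumptions, and a real deployment would feed YOLO detections in and drive the PTZ camera out.

```python
import math

# Hypothetical threshold: detections farther than this (in pixels) from the
# last confirmed position are ignored, so the camera does not jump between birds.
MAX_JUMP_PX = 120

def select_target(detections, last_pos):
    """Pick the detection closest to the previous position, rejecting large jumps.

    detections: list of (x, y) bounding-box centres in pixels.
    last_pos: (x, y) of the previously tracked bird, or None before lock-on.
    Returns the updated target position, or last_pos if every candidate is too far.
    """
    if not detections:
        return last_pos
    if last_pos is None:
        return detections[0]  # first lock-on: accept any detection
    best = min(detections, key=lambda p: math.dist(p, last_pos))
    return best if math.dist(best, last_pos) <= MAX_JUMP_PX else last_pos

def centring_offset(target, frame_size=(1920, 1080)):
    """Pixel offset of the target from the frame centre; a PTZ controller
    would translate this into pan/tilt commands to re-centre the bird."""
    cx, cy = frame_size[0] / 2, frame_size[1] / 2
    return target[0] - cx, target[1] - cy
```

For example, if the tracked bird was last seen at (110, 105), a new detection at (100, 100) is accepted, while a lone detection at (900, 900) is rejected as a different bird and the previous fix is kept.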
The second algorithm is designed for population and behavioural analysis through the automated processing of segmented video footage, generating structured data stored in JSON format. This algorithm uses the YOLO model to detect birds in each frame and tracks individuals using advanced identity assignment techniques. From these data, the system calculates key metrics such as the average area occupied by the birds, the total number of individuals detected, and precise spatial information including distance, absolute azimuth, and colatitude. These data are then used to produce sky heat maps, collages representing the spatial density of birds, and vector diagrams showing predominant flight directions.
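A minimal sketch of this kind of aggregation is shown below. The per-frame record shape, field names, and heading convention are assumptions for illustration only; the project's actual JSON schema and identity-assignment method are not specified in the article.

```python
import math
from collections import defaultdict

# Assumed per-frame record shape: {"frame": n, "id": k, "bbox": [x, y, w, h]},
# where "id" is the track identity assigned by the tracker.
def summarise(records):
    """Aggregate tracked detections into population-level metrics:
    individual count, mean bounding-box area, and per-track flight heading."""
    areas = [r["bbox"][2] * r["bbox"][3] for r in records]
    tracks = defaultdict(list)
    for r in sorted(records, key=lambda r: r["frame"]):
        x, y, w, h = r["bbox"]
        tracks[r["id"]].append((x + w / 2, y + h / 2))  # bounding-box centre
    headings = {}
    for tid, pts in tracks.items():
        if len(pts) >= 2:
            dx = pts[-1][0] - pts[0][0]
            dy = pts[-1][1] - pts[0][1]
            # Image y grows downward; 0 deg = up, increasing clockwise.
            headings[tid] = math.degrees(math.atan2(dx, -dy)) % 360
    return {
        "individuals": len(tracks),
        "mean_area_px": sum(areas) / len(areas) if areas else 0.0,
        "headings_deg": headings,
    }
```

The per-track heading vectors computed this way could feed the vector diagrams of predominant flight directions, while the accumulated centre positions could populate the sky heat maps.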
The IA4Birds project is an initiative in which the AIR Institute is supported by the Biodiversity Foundation of the Ministry for Ecological Transition and the Demographic Challenge (MITECO), within the framework of the Recovery, Transformation and Resilience Plan (PRTR), funded by the European Union - Next Generation EU.