FlexiGroBots - People detection, location, and tracking
This application enhances agricultural safety by enabling human operators and autonomous robots to work on the same land sections at the same time. It addresses the growing need for robotic solutions in agriculture, driven by labor shortages and economic pressure, by adding safety layers and improving robots' perception of their surroundings.
This tool is a component of the European project FlexiGroBots. It combines several models that together provide services for detecting, locating, and tracking people in agricultural environments. The service is distributed via Docker and is available in the public GitHub repository associated with this application.
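As a minimal sketch of the Docker-based workflow (the repository URL, image name, and run flags below are assumptions, not the project's documented interface; consult the GitHub repository for the actual instructions), usage typically looks like:

```shell
# Hypothetical commands - image name and flags are placeholders.
git clone <repository-url>            # clone the public GitHub repository
cd <repository-directory>
docker build -t people-detection .    # build the service image
docker run --gpus all people-detection  # run with GPU access enabled
```

The `--gpus all` flag requires the NVIDIA Container Toolkit, consistent with the GPU requirement noted below.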
The final tool implementation, part of a sophisticated computer vision project, offers several key functionalities:

- Detection and segmentation of people, vehicles, and other elements using techniques such as YOLOv8 (trained on COCO) or Detic.
- Tracking of detected objects over time through methods such as zero-shot tracking or ByteTrack.
- Estimation of object distance from a monocular camera using DPT, enhanced by a new color map for clearer depth interpretation.
- Smoothing of depth estimates across video sequences with a Kalman filter.
- Alerts when objects come too close to the camera, with the nearest person or object visually highlighted for quick identification.
- Real-time alerts to the Mission Control Center, crucial for decision-making in controlled environments.
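To illustrate the depth-smoothing and proximity-alert steps, here is a minimal sketch (not the project's actual code) of a one-dimensional Kalman filter applied to noisy per-frame depth readings, with an alert when the smoothed distance falls below a threshold. The class name, parameter values, and the 2 m threshold are all hypothetical:

```python
class KalmanDepthSmoother:
    """Constant-value Kalman filter for a single scalar depth reading (meters)."""

    def __init__(self, process_var: float = 1e-3, measurement_var: float = 0.25):
        self.process_var = process_var          # how fast the true depth may drift
        self.measurement_var = measurement_var  # assumed noise of the depth estimate
        self.estimate = None                    # current smoothed depth
        self.error_var = 1.0                    # variance of the current estimate

    def update(self, measurement: float) -> float:
        if self.estimate is None:               # initialise on the first reading
            self.estimate = measurement
            return self.estimate
        # Predict: depth assumed roughly constant, so only uncertainty grows.
        self.error_var += self.process_var
        # Update: blend prediction and measurement using the Kalman gain.
        gain = self.error_var / (self.error_var + self.measurement_var)
        self.estimate += gain * (measurement - self.estimate)
        self.error_var *= 1.0 - gain
        return self.estimate


ALERT_DISTANCE_M = 2.0  # hypothetical safety threshold


def check_proximity(smoothed_depth: float) -> bool:
    """Return True when an object is closer than the alert threshold."""
    return smoothed_depth < ALERT_DISTANCE_M
```

In a video pipeline, each frame's estimated distance for a tracked object would be passed through `update`, and `check_proximity` would drive the visual highlight and the alert sent to the Mission Control Center.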
Technically, the tool requires a high-resolution monocular camera, a powerful GPU, and substantial memory and disk space. It runs on Ubuntu 22.04 with Python 3.10. The functional requirements, centered on real-time processing and high accuracy, remain consistent with previous project specifications.