Camera-equipped drones are fun and enjoyable toys in private settings and for personal video shoots; but quad-copters in particular have recently also become nuisances in local environments, or even severe (life-)threatening hazards in combat zones. Reliably detecting drones with minimal delay is a combined algorithmic and hardware challenge, first in local environments and possibly later in larger spaces.
In this project, we will employ event-based vision cameras (refs 1, 2) to detect the unique spatio-temporal visual signals that drone propellers produce as they spin at high frequencies. The students will design event-based processing algorithms to pick those signals out of the background, and program a mobile event camera (on a pan-tilt-yaw gimbal head) to follow the identified object. In the research lab we have access to event cameras and to toy drones that can be used for prototype development and benchmarking in this project.
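To illustrate the core idea, here is a minimal detection sketch in Python/NumPy. It assumes events are delivered as a structured array with fields x, y, t (microseconds), and p (polarity); the window length and flip-rate threshold are placeholder values that would need tuning on the recorded data, not parameters prescribed by the project.

```python
# A minimal sketch: flag pixels whose event polarity flips at propeller-like
# rates. All thresholds below are assumptions, to be tuned on real recordings.

import numpy as np

def propeller_mask(events, shape, window_us=10_000, min_flips_hz=400.0):
    """Flag pixels whose ON/OFF polarity flips exceed min_flips_hz.

    Spinning blades sweep each pixel many times per rotation, producing
    polarity flips at hundreds of Hz; static background rarely does.
    """
    flips = np.zeros(shape, dtype=np.int32)
    last_pol = np.full(shape, -1, dtype=np.int8)   # -1 = no event seen yet
    t_end = events["t"][-1]
    recent = events[events["t"] >= t_end - window_us]
    for x, y, p in zip(recent["x"], recent["y"], recent["p"]):
        if last_pol[y, x] != -1 and last_pol[y, x] != p:
            flips[y, x] += 1
        last_pol[y, x] = p
    flip_rate = flips / (window_us * 1e-6)          # flips per second
    return flip_rate >= min_flips_hz

def target_centroid(mask):
    """Centroid of candidate pixels, or None if nothing was detected."""
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None
    return float(xs.mean()), float(ys.mean())
```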
[Picture: stereo drone-detecting cameras]
Tasks
- Understand the (processing) differences between video and event cameras in a high-speed computer-vision task; select and implement a suitable vision processing algorithm to detect spinning propellers.
- Control a camera pan-tilt unit to follow an approaching drone in a closed-loop setting (see the control sketch after this list).
- Evaluate the performance of the vision task (latency and precision).
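The following sketch outlines what the closed-loop control task could look like: a simple proportional controller that steers the gimbal toward the detected centroid. The GimbalHead interface (a set_rates method), the field-of-view figures, and the gain are hypothetical placeholders; propeller_mask and target_centroid refer to the detection sketch above. The real pan-tilt-yaw unit in the lab will have its own API.

```python
# A minimal closed-loop sketch. GimbalHead.set_rates, the resolution, the
# fields of view, and the gain are all placeholder assumptions.

IMG_W, IMG_H = 640, 480          # sensor resolution (placeholder)
DEG_PER_PX_X = 60.0 / IMG_W      # assumed horizontal field of view: 60 deg
DEG_PER_PX_Y = 45.0 / IMG_H      # assumed vertical field of view: 45 deg
KP = 4.0                         # proportional gain, deg/s per deg of error

def pan_tilt_rates(centroid):
    """Map a detected pixel centroid to pan/tilt rate commands (deg/s)."""
    cx, cy = centroid
    err_pan = (cx - IMG_W / 2) * DEG_PER_PX_X     # angular error, degrees
    err_tilt = (cy - IMG_H / 2) * DEG_PER_PX_Y
    return KP * err_pan, -KP * err_tilt           # signs depend on mounting

def control_step(gimbal, events, shape):
    """One iteration of the detect-then-steer loop."""
    mask = propeller_mask(events, shape)          # from the sketch above
    centroid = target_centroid(mask)
    if centroid is None:
        gimbal.set_rates(0.0, 0.0)                # hold still if no target
    else:
        pan_rate, tilt_rate = pan_tilt_rates(centroid)
        gimbal.set_rates(pan_rate, tilt_rate)
```

A rate (velocity) command was chosen over absolute angles here because it degrades gracefully when detections drop out; the actual choice will depend on the gimbal's control modes.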
Process
- Familiarize yourself with the existing camera setup (event camera on a pan-tilt-yaw gimbal).
- Read the existing literature on event cameras and processing algorithms; identify useful tracking principles and existing algorithms.
- Implement the computer vision algorithm and evaluate its performance in a closed-loop setting.
- Write report, prepare live demonstration / video.
Expected Outcomes
The pan-tilt-yaw gimbal with the event camera is set up and configured. We will first perform a few data recordings while hand-operating the drones. The students then select event-based vision processing algorithms for implementation and evaluate them on the collected data. When successful, the students can run the algorithms in closed loop on the pan-tilt gimbal camera head to track a flying drone in real time. An evaluation of the latency and the precision of the vision algorithms will conclude the project.
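As a sketch of what the concluding evaluation could compute, the snippet below derives latency and precision statistics from hypothetical logs; the tuple layout and field order are assumptions, not a fixed project format.

```python
# A minimal evaluation sketch, assuming logged detections of
# (event timestamp, command timestamp, centroid x, centroid y) and
# hand-annotated ground-truth positions aligned one-to-one with them.

import numpy as np

def evaluate(detections, ground_truth):
    """detections: list of (t_event_us, t_command_us, cx, cy);
    ground_truth: list of (gt_x, gt_y), aligned with detections."""
    det = np.asarray(detections, dtype=np.float64)
    gt = np.asarray(ground_truth, dtype=np.float64)
    latency_ms = (det[:, 1] - det[:, 0]) * 1e-3       # event-to-command delay
    err_px = np.hypot(det[:, 2] - gt[:, 0], det[:, 3] - gt[:, 1])
    print(f"median latency: {np.median(latency_ms):.2f} ms")
    print(f"mean position error: {err_px.mean():.1f} px "
          f"(95th percentile: {np.percentile(err_px, 95):.1f} px)")
```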
Prerequisites
Programming in C (for high-speed processing) and Python; some experience with computer vision.
References
1. Bluvec, "Drone Detection Using Event-Based Camera." https://www.linkedin.com/pulse/drone-detection-using-event-based-camera-bluvec/
2. G. Gallego et al., "Event-Based Vision: A Survey," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 44, no. 1, pp. 154-180, 2022. doi: 10.1109/TPAMI.2020.3008413. https://www.computer.org/csdl/journal/tp/2022/01/09138762/1llK3L5znva