To learn the basics of embedded systems, I am currently working on a team of five undergraduates programming a Parrot AR Drone 2.0. The drone itself is not inherently programmable and has very limited onboard resources. Therefore, my team is interfacing the drone with a controlling computer, allowing us to retrieve data in real time, process it, and instruct the drone accordingly. The ultimate goal is to present visual commands to the drone’s onboard camera in order to control its actions. To make this project successful, we must develop several hardware/software components, including:
- A mounted Arduino microcontroller that gathers ultrasonic distance measurements of the drone’s surrounding environment. This data is then transmitted to a secondary computer over Bluetooth.
- A software component that filters and interprets the ultrasonic data to provide the drone with meaningful proprioceptive information.
- A Node.js program that interfaces with the drone over Wi-Fi, using TCP/IP networking and the HTTP protocol to send messages. The program runs a control loop that continually issues commands to the drone and retrieves the camera feed.
- A computer vision component that uses OpenCV to match predefined command templates to the real-time camera feed.
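To illustrate the kind of filtering the second component performs, the sketch below applies a sliding median to a stream of range readings. Ultrasonic sensors are prone to single-sample spikes from spurious echoes, and a median window rejects them where a plain average would not. The function name, window size, and sample values are my own assumptions, not the team's actual code.

```javascript
// Sliding median filter for noisy ultrasonic range readings (illustrative).
function medianFilter(readings, windowSize = 5) {
  const filtered = [];
  for (let i = 0; i < readings.length; i++) {
    const start = Math.max(0, i - windowSize + 1);
    // Sort a copy of the current window and take its median.
    const window = readings.slice(start, i + 1).sort((a, b) => a - b);
    const mid = Math.floor(window.length / 2);
    filtered.push(
      window.length % 2 ? window[mid] : (window[mid - 1] + window[mid]) / 2
    );
  }
  return filtered;
}

// A 300 cm spike amid ~100 cm readings is suppressed:
console.log(medianFilter([100, 102, 300, 101, 99]));
// → [ 100, 101, 102, 101.5, 101 ]
```

A moving average would also smooth the data, but the median keeps a hard floor against outliers, which matters when a single bogus reading could make the drone swerve.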
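The commands the Node.js program sends are plain-text "AT" commands defined by the AR.Drone SDK. One quirk documented in the SDK manual is that floating-point arguments are transmitted as the signed 32-bit integer sharing the same bit pattern. The wrapper function below is a hypothetical sketch of building a progressive-movement (`AT*PCMD`) command; only the encoding rule and command format come from the SDK.

```javascript
// AR.Drone AT commands encode float arguments as the signed 32-bit
// integer with the same IEEE-754 bit pattern (per the SDK manual).
function floatToIntBits(f) {
  const buf = Buffer.alloc(4);
  buf.writeFloatBE(f, 0);
  return buf.readInt32BE(0);
}

let seq = 0; // every AT command carries an increasing sequence number

// Illustrative wrapper, not the team's actual code: flag 1 enables
// the roll/pitch/gaz/yaw arguments; each command ends with '\r'.
function pcmd(roll, pitch, gaz, yaw) {
  seq += 1;
  const args = [roll, pitch, gaz, yaw].map(floatToIntBits).join(',');
  return `AT*PCMD=${seq},1,${args}\r`;
}

console.log(pcmd(-0.8, 0, 0, 0));
// The SDK manual's own example: -0.8 encodes to -1085485875
```

The returned string is what gets written onto the network socket; sending it repeatedly in the control loop keeps the drone moving until a new command replaces it.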
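The computer vision component relies on OpenCV's template matching, which slides each command template across the camera frame and scores the overlap at every position. The toy version below shows the idea on small grayscale arrays using a sum-of-squared-differences score (one of the comparison methods OpenCV offers); the data and function name are illustrative, and the real system would call OpenCV rather than loop in JavaScript.

```javascript
// Toy template matching: slide the template over the image, score each
// position by sum of squared differences (SSD), keep the lowest score.
function matchTemplate(image, template) {
  const th = template.length, tw = template[0].length;
  let best = { x: 0, y: 0, score: Infinity };
  for (let y = 0; y + th <= image.length; y++) {
    for (let x = 0; x + tw <= image[0].length; x++) {
      let score = 0;
      for (let j = 0; j < th; j++) {
        for (let i = 0; i < tw; i++) {
          const d = image[y + j][x + i] - template[j][i];
          score += d * d;
        }
      }
      if (score < best.score) best = { x, y, score };
    }
  }
  return best;
}

// The 2x2 block of 9s sits at (x=2, y=1) in this image:
const image = [
  [0, 0, 0, 0],
  [0, 0, 9, 9],
  [0, 0, 9, 9],
];
console.log(matchTemplate(image, [[9, 9], [9, 9]]));
// → { x: 2, y: 1, score: 0 }
```

A perfect overlap yields a score of zero; in practice a threshold on the best score decides whether a command template is actually present in the frame.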
Overall, this is a very exciting project because of the number of technologies that must be developed and interfaced in parallel, such as Bluetooth communication, Arduino embedded systems, Node.js applications, and the OpenCV library. Our team’s website can be found here: