Exploring Embedded Systems through building an Autonomous Drone


To learn the basics of embedded systems, I am currently working on a team of five undergraduates programming a Parrot AR Drone 2.0. The drone itself is not inherently programmable and has very limited onboard resources, so my team is interfacing it with a controlling computer, allowing us to retrieve data in real time, process it, and instruct the drone accordingly. The ultimate goal is to control the drone's actions through visual commands shown to its onboard camera. To make this project successful, we must develop several hardware and software components, including:

  1. A mounted Arduino microcontroller that gathers ultrasonic distance data about the drone's surrounding environment. This data is then transferred to a secondary computer over Bluetooth.
  2. A software component that filters and interprets the ultrasonic data to give the drone meaningful proprioceptive information.
  3. A Node.js program that interfaces with the drone over Wi-Fi, using TCP/IP networking and the HTTP protocol to send messages. This program runs in a continuous loop, issuing commands to the drone and retrieving the camera feed.
  4. A computer vision component that uses OpenCV to match predefined command templates against the real-time camera feed.
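Ultrasonic rangefinders are noisy and prone to dropout spikes, so the filtering component (item 2) can start with something as simple as a sliding median filter. A minimal sketch in JavaScript (the function name and window size are our own illustration, not from any library):

```javascript
// Sliding median filter: smooths ultrasonic distance readings (cm)
// by replacing each sample with the median of the last `windowSize`
// samples, which rejects single-sample dropout spikes.
function medianFilter(readings, windowSize) {
  const filtered = [];
  for (let i = 0; i < readings.length; i++) {
    // Take up to `windowSize` most recent samples ending at index i.
    const start = Math.max(0, i - windowSize + 1);
    const window = readings.slice(start, i + 1).sort((a, b) => a - b);
    filtered.push(window[Math.floor(window.length / 2)]);
  }
  return filtered;
}

// A spike at index 2 (sensor dropout reporting 400 cm) is suppressed:
console.log(medianFilter([30, 31, 400, 32, 33], 3));
// → [30, 31, 31, 32, 33]
```

A median is preferable to a plain moving average here because a single bogus reading shifts an average but leaves the median untouched.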
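For item 3, the AR Drone 2.0 is steered with ASCII "AT" commands sent over the Wi-Fi link; Node.js libraries such as node-ar-drone wrap this, but the encoding is worth understanding. One quirk of the protocol is that floating-point arguments of the movement command (AT*PCMD) are transmitted as the signed 32-bit integer sharing the float's bit pattern. A sketch of that encoding (sequence numbering and networking omitted, and the helper names are our own):

```javascript
// The AR Drone protocol transmits float arguments (roll, pitch, gaz,
// yaw, each in [-1, 1]) as the signed 32-bit integer whose bits equal
// the IEEE-754 single-precision representation of the float.
function floatToProtocolInt(f) {
  const buf = Buffer.alloc(4);
  buf.writeFloatBE(f, 0);     // write the IEEE-754 bits
  return buf.readInt32BE(0);  // reinterpret them as a signed int32
}

// Build an AT*PCMD progressive-movement command string. `seq` is the
// running sequence number the drone uses to discard stale packets;
// the flag argument (1) enables progressive control.
function buildPcmd(seq, roll, pitch, gaz, yaw) {
  const args = [roll, pitch, gaz, yaw].map(floatToProtocolInt);
  return `AT*PCMD=${seq},1,${args.join(',')}\r`;
}

console.log(buildPcmd(5, -0.5, 0, 0, 0));
// → AT*PCMD=5,1,-1090519040,0,0,0   (then a carriage return)
```

Here −0.5 travels as −1090519040, the int32 view of its float bits; the drone reinterprets it on the other side.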
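For item 4, OpenCV's matchTemplate slides a template over each camera frame and scores every possible position; the best-scoring location tells us whether (and where) a command template appears. The core idea on toy grayscale arrays, using plain sum-of-squared-differences rather than OpenCV's optimized, normalized scoring methods:

```javascript
// Naive template matching by sum of squared differences (SSD):
// slide the template over the image and return the offset with the
// lowest SSD. cv.matchTemplate performs this same sweep, only with
// faster, normalization-aware scoring functions.
function matchTemplate(image, template) {
  const ih = image.length, iw = image[0].length;
  const th = template.length, tw = template[0].length;
  let best = { x: 0, y: 0, score: Infinity };
  for (let y = 0; y + th <= ih; y++) {
    for (let x = 0; x + tw <= iw; x++) {
      let ssd = 0;
      for (let ty = 0; ty < th; ty++) {
        for (let tx = 0; tx < tw; tx++) {
          const d = image[y + ty][x + tx] - template[ty][tx];
          ssd += d * d;
        }
      }
      if (ssd < best.score) best = { x, y, score: ssd };
    }
  }
  return best;
}

// The 2x2 template appears at (x=2, y=1) in this 4x4 "frame":
const frame = [
  [0, 0, 0, 0],
  [0, 0, 9, 9],
  [0, 0, 9, 9],
  [0, 0, 0, 0],
];
const tmpl = [
  [9, 9],
  [9, 9],
];
console.log(matchTemplate(frame, tmpl));
// → { x: 2, y: 1, score: 0 }
```

In the real pipeline the "frame" is each decoded camera image and a score threshold decides whether the match is confident enough to trigger a drone command.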

Overall, this is a very exciting project because of the number of technologies that must be developed and interfaced in parallel: Bluetooth communication, Arduino embedded systems, Node.js applications, and OpenCV libraries. Our team's website can be found here:


