Gesture Aerial Camera: Senior Project

About

For my Computer Engineering senior project, I decided to construct a quadcopter with a gesture-based control scheme. As the project became more defined, I described the idea as a dynamic camera option: a camera-mounted quadcopter with intuitive motion controls. The result was a working prototype and a strong step toward a full product.

My Contributions

Specifications and Documentation

Before any physical work was done, a specifications document and schedule were created. Next came the design, based on the specifications and presented in a Design Review.

A Custom Copter

The first stage of the physical side of my project was to gather the components I would need to build a quadcopter that could not only fly, but also receive commands from the device reading the user's gestures. I spent several weeks studying various motors, propellers, frames, electronic speed controllers, and flight controllers. Once selected, those components were combined with a Raspberry Pi Zero W over UART serial lines to create a Wi-Fi-enabled quadcopter.

A Gesture Armband

The project consisted of three major parts: a commercial gesture-sensing armband, a phone control app, and the quadcopter itself. During the conceptualization phase of the project, I was directed toward Myo, an electromyographic armband that uses muscle sensors to detect when the wearer makes certain hand shapes. Integration into the project turned out to be quite simple, as it had a well-written API and access to an Android Intent for pairing.
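To illustrate the mapping layer between armband and copter, here is a sketch of translating detected poses into flight commands. The pose names follow the Myo SDK's standard pose set (fist, wave-in, wave-out, fingers-spread, double-tap), but the command assignments shown are hypothetical, not the project's actual mapping:

```cpp
#include <string>

// Standard poses reported by the Myo SDK.
enum class Pose { Rest, Fist, WaveIn, WaveOut, FingersSpread, DoubleTap };

// Hypothetical pose-to-command assignments, for illustration only;
// the project's real mapping is not documented here.
std::string pose_to_command(Pose p) {
    switch (p) {
        case Pose::Fist:          return "HOLD";      // hover in place
        case Pose::WaveIn:        return "YAW_LEFT";
        case Pose::WaveOut:       return "YAW_RIGHT";
        case Pose::FingersSpread: return "ASCEND";
        case Pose::DoubleTap:     return "LAND";
        default:                  return "NONE";      // Rest: no command
    }
}
```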

The Phone App

By pairing the Myo armband with the user's phone, I already had a convenient way to interpret gestures and turn them into commands. The next task was to transmit those commands to the quadcopter. Wi-Fi Direct was chosen to connect the phone to the quadcopter, as it offers much greater range than Bluetooth. The app opened with a screen listing all detected Wi-Fi Direct devices nearby, and would request a connection to the one selected by the user. Once paired, it transferred the user to a quadcopter control screen overlaid on a video-feedback panel.

A Pi Control Program

On the receiving end was a control program for the quadcopter, written in C++ and targeting the Raspberry Pi. Getting the Pi to auto-accept incoming Wi-Fi Direct connections was quite a challenge, due to the structure of Debian's networking stack, and it took extensive research, adjustments, and testing to determine the correct sequence of commands and event responses. Once communications were established, the control program would take in data from the phone app and transmit it to the flight controller over the UART serial lines. Successful communication required replicating one of the protocols accepted by the flight controller, which itself needed configuration.
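The writeup does not name the flight-controller protocol; as an example of the kind of framing involved, here is a sketch of a MultiWii Serial Protocol (MSP) v1 frame, which many hobby flight controllers accept (an assumption — the project may have replicated a different protocol). An MSP v1 frame is the header "$M<", a payload length byte, a command ID byte, the payload, and an XOR checksum over length, ID, and payload:

```cpp
#include <cstdint>
#include <vector>

// Build an MSP v1 frame: "$M<", size, command ID, payload, XOR checksum.
// Using MSP here is an assumption; substitute the protocol your flight
// controller is configured to accept.
std::vector<uint8_t> msp_frame(uint8_t command, const std::vector<uint8_t>& payload) {
    std::vector<uint8_t> frame = {'$', 'M', '<'};
    uint8_t size = static_cast<uint8_t>(payload.size());
    frame.push_back(size);
    frame.push_back(command);
    frame.insert(frame.end(), payload.begin(), payload.end());
    uint8_t checksum = size ^ command;      // XOR of size, command, and payload
    for (uint8_t b : payload) checksum ^= b;
    frame.push_back(checksum);
    return frame;
}
```

The resulting byte vector is what would be written to the UART file descriptor.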

gRPC Integration

To reduce message overhead, and with it latency, Google's gRPC (Remote Procedure Call) library was integrated into both the phone app and the Pi control program. gRPC uses Protocol Buffers ("protobufs"), a schema language for defining messages and services that can be compiled into a variety of programming languages. These were used to define the protocols for transmitting and receiving control updates and status updates.
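As an illustration, a protobuf definition for such messages might look like the following. The message, field, and service names here are hypothetical, not the project's actual schema:

```protobuf
syntax = "proto3";

// Hypothetical schema for illustration; not the project's actual definitions.
message ControlUpdate {
  int32 roll = 1;
  int32 pitch = 2;
  int32 yaw = 3;
  int32 throttle = 4;
}

message StatusUpdate {
  float battery_voltage = 1;
  bool armed = 2;
}

service CopterControl {
  rpc SendControl (ControlUpdate) returns (StatusUpdate);
}
```

The protobuf compiler then generates matching message classes and client/server stubs for both the Java/Kotlin app and the C++ control program.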

RTSP Server

For the final part of the project, the camera mounted on the quadcopter needed to transmit video to the user's phone, where it could be recorded. To do this, the Pi was configured to launch a Real Time Streaming Protocol (RTSP) server carrying H.264-encoded video, which clients on the network could connect to for a live stream. The phone connected to this stream over the Wi-Fi Direct link and was configured to record it to the local device.
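One common way to stand up such a server on a Pi is to pipe the camera's H.264 output from raspivid into VLC's RTSP output; this is an assumption about the approach, as the exact tooling is not recorded in the writeup. A sketch that assembles such a launch command:

```cpp
#include <string>

// Assemble a hypothetical stream-launch command: raspivid's H.264 output on
// stdout, piped into cvlc, which serves it over RTSP on the given port.
// This recipe is an assumption; consult raspivid and VLC documentation for
// the flags appropriate to your camera and desired resolution.
std::string build_stream_command(int port) {
    return "raspivid -t 0 -o - | "
           "cvlc stream:///dev/stdin "
           "--sout '#rtp{sdp=rtsp://:" + std::to_string(port) + "/stream}' "
           ":demux=h264";
}
```

The control program could hand this string to the shell (e.g. via `std::system`) at startup, and clients would then connect to `rtsp://<pi-address>:<port>/stream`.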

Final Presentation

As the final task for the project, I was required to present my work to judges and to the public on Senior Design Night. For this, I created a poster, a presentation video, a final report, and a user manual. My project, the Gesture Aerial Camera, tied for third place among all the projects that semester.