This document describes a brain-computer interface (BCI) system that uses steady-state visually evoked potentials (SSVEPs), detected via electroencephalography (EEG), to control a drone. Five lights flashing at distinct frequencies elicit frequency-locked neural responses; these responses are processed with recursive least squares (RLS) adaptive filtering and classified with canonical correlation analysis (CCA), and each detected frequency is mapped to a drone movement command. The system successfully discriminated among the five stimulus frequencies, allowing a user to issue a drone command within 5-10 seconds using only their brain signals.
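The CCA step can be sketched as follows: for each candidate stimulus frequency, build reference signals (sines and cosines at the frequency and its harmonics), compute the largest canonical correlation between the EEG window and those references, and select the frequency with the highest score. This is a minimal illustrative sketch of standard CCA-based SSVEP classification, not the paper's exact pipeline; the sampling rate, window length, harmonic count, and frequency set below are hypothetical choices.

```python
import numpy as np

def max_canonical_corr(X, Y):
    # Center both matrices, orthonormalize columns via QR; the largest
    # singular value of Qx.T @ Qy is the largest canonical correlation.
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(Xc)
    Qy, _ = np.linalg.qr(Yc)
    s = np.linalg.svd(Qx.T @ Qy, compute_uv=False)
    return s[0]

def cca_reference(freq, fs, n_samples, n_harmonics=2):
    # Sine/cosine templates at the stimulus frequency and its harmonics.
    t = np.arange(n_samples) / fs
    refs = []
    for h in range(1, n_harmonics + 1):
        refs.append(np.sin(2 * np.pi * h * freq * t))
        refs.append(np.cos(2 * np.pi * h * freq * t))
    return np.column_stack(refs)

def classify_ssvep(eeg, fs, freqs):
    # eeg: (n_samples, n_channels). Returns the candidate frequency
    # whose reference set correlates best with the EEG window.
    scores = [max_canonical_corr(eeg, cca_reference(f, fs, eeg.shape[0]))
              for f in freqs]
    return freqs[int(np.argmax(scores))]

# Demo on synthetic 2-channel EEG dominated by a 12 Hz SSVEP plus noise.
fs, n = 250, 1250                      # 250 Hz sampling, 5 s window (assumed)
rng = np.random.default_rng(0)
t = np.arange(n) / fs
signal = np.sin(2 * np.pi * 12 * t)
eeg = np.column_stack([signal + 0.5 * rng.standard_normal(n),
                       0.8 * signal + 0.5 * rng.standard_normal(n)])
freqs = [8.0, 10.0, 12.0, 15.0, 20.0]  # hypothetical stimulus frequencies
print(classify_ssvep(eeg, fs, freqs))
```

In a real system each selected frequency would then be mapped to a drone command (e.g. forward, back, left, right, hover); that mapping is application-specific and not shown here.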