Enhanced pilot engagement level using the brain-machine interface (BMI) in live flight control

Presenter Information

Mark Thivierge
Kyle Mott
Redwan Alqasemi

Comments

Poster Presentation

Campus Affiliation

Tampa

Mentor Information

Redwan Alqasemi

Description

Critical battlefield missions require proficient physical and cognitive capabilities for situational awareness and active decision making. Current technologies allow pilots of unmanned aircraft to use non-invasive electroencephalography (EEG)-based Brain-Machine Interfaces (BMIs) to capture brain signals and convert them into commands for flight control. This project uses a motor imagery-based BMI system to extract, filter, and condition the pilot's brain signal to select a flight target destination. Additional algorithms were developed to extract GPS data, map the target location to global coordinates using homogeneous transformation matrices, and autonomously navigate to and follow the selected target. Moreover, a graphical user interface (GUI) was developed to give the user visual feedback from the onboard flight camera and to display the commands that can be issued with the EEG signal. A drone was used for testing and data collection, and the results show that the developed system was able to take the brain's raw BMI data and convert it into (1) mental commands derived from the filtered EEG signal, and (2) facial-expression commands derived from extracted EMG and EOG data. The latter was easier to use and more accurate in conveying the pilot's intentions. Pilot training was conducted for both control categories, and full control of the drone using brain signals, reaching the target location, was achieved with an average error of 0.1%. The accuracy of control using the brain signal through the BMI was 57.89% at the start of the control session and decreased to 46.34% after 30 minutes of use.
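As an illustration of the coordinate-mapping step described above, the sketch below applies a homogeneous transformation matrix to move a target point from the drone's body frame into a global frame. The abstract only states that homogeneous transforms were used; the East-North-Up local frame, the yaw-only (level-flight) rotation, and all function and variable names here are illustrative assumptions, not the authors' implementation.

```python
"""Minimal sketch: body-frame target -> global coordinates via a 4x4
homogeneous transform. Frame conventions and the yaw-only rotation are
assumptions for illustration."""
import numpy as np

def body_to_world(drone_pos_enu, yaw_rad):
    """Homogeneous transform from the drone body frame to a local
    East-North-Up world frame, assuming level flight (rotation about
    the vertical axis only)."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    T = np.eye(4)
    T[:3, :3] = np.array([[c, -s, 0],
                          [s,  c, 0],
                          [0,  0, 1]])   # yaw rotation
    T[:3, 3] = drone_pos_enu            # translation = drone position (from GPS)
    return T

# Hypothetical example: target seen 5 m ahead and 2 m to the right of the
# drone, at the same altitude, expressed in homogeneous coordinates.
target_body = np.array([5.0, -2.0, 0.0, 1.0])
T = body_to_world(drone_pos_enu=np.array([10.0, 20.0, 30.0]), yaw_rad=np.pi / 4)
target_world = (T @ target_body)[:3]    # global point the autopilot can fly to
print(target_world)
```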
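The signal-conditioning step can be pictured the same way. Motor-imagery BMIs commonly band-pass the raw EEG to the mu/beta range before classification; the abstract does not specify the authors' filter design, so the 8–30 Hz band, 256 Hz sampling rate, and Butterworth filter below are assumptions, shown only to make the "extract, filter, and condition" step concrete.

```python
"""Hedged sketch of EEG conditioning: zero-phase band-pass filtering of one
raw channel to the mu/beta band typically used for motor imagery. Band
edges, sampling rate, and filter order are assumed, not taken from the
abstract."""
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_eeg(raw, fs=256.0, low=8.0, high=30.0, order=4):
    """Butterworth band-pass applied forward and backward (filtfilt)
    so the filtered signal has no phase distortion."""
    nyq = fs / 2.0
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, raw)

# Example on two seconds of synthetic single-channel EEG:
# a 10 Hz (mu-band) oscillation buried in noise.
t = np.arange(0, 2.0, 1 / 256.0)
raw = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
mu_beta = bandpass_eeg(raw)
```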

