Undergraduate Research

Chuck


AIR lab was recently awarded a $5000 grant to develop an embodied cognitive system capable of learning to play cornhole, a lawn game in which players take turns throwing bean bags at angled platforms set a few meters apart. Thus far I have modeled and simulated a single bean bag's trajectory and its collision with the platform, and I am beginning to evaluate the performance of temporal-difference (TD) learning, a reinforcement learning method.
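As a rough sketch of how TD learning could be hooked up to the throw simulation, the toy below runs a tabular TD(0) update over a discretized set of launch angles; the board distance, throw-speed range, angle grid, and drag-free physics are all assumptions for illustration, not the actual simulation's parameters.

```python
import math
import random

random.seed(0)

G = 9.81               # gravity (m/s^2)
BOARD_DISTANCE = 8.0   # assumed distance to the front of the board (m)
BOARD_LENGTH = 1.2     # assumed board length along the throw axis (m)

def lands_on_board(speed, angle):
    """Drag-free projectile range: does the bag come down on the board?"""
    rng = speed * speed * math.sin(2 * angle) / G
    return BOARD_DISTANCE <= rng <= BOARD_DISTANCE + BOARD_LENGTH

# Tabular TD(0) over the angle grid: each throw is a one-step episode,
# so the value estimate is nudged toward the observed reward.
angles = [math.radians(a) for a in range(20, 71, 5)]
values = dict.fromkeys(angles, 0.0)
alpha = 0.1  # learning rate

for _ in range(5000):
    a = random.choice(angles)          # uniform exploration
    speed = random.uniform(7.0, 11.0)  # noisy throw speed
    reward = 1.0 if lands_on_board(speed, a) else 0.0
    values[a] += alpha * (reward - values[a])   # TD(0) update

best = max(values, key=values.get)  # most promising launch angle so far
```

In the full problem the agent would also choose the throw speed and learn from the collision outcome, but the same update rule applies.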

EMG


The video above shows me controlling a robotic arm with electromyography (EMG) signals. The signals are acquired from the bicep, tricep, anterior deltoid, and posterior deltoid. Initially, I acquired these signals at three distinct arm positions so they could be classified easily. A Nearest Neighbors machine learning algorithm classifies each new data set and sends the corresponding servo positions to a microcontroller driving the robotic arm.
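A minimal sketch of that classify-then-command loop is below; the feature values, class names, and servo angles are all hypothetical, and a real pipeline would extract features from windowed EMG samples and write the command to the microcontroller over serial.

```python
import math

# Hypothetical training data: one feature per channel (bicep, tricep,
# anterior deltoid, posterior deltoid), labeled with a trained arm position.
TRAIN = [
    ((0.8, 0.1, 0.2, 0.1), "flexed"),
    ((0.1, 0.7, 0.2, 0.2), "extended"),
    ((0.3, 0.3, 0.8, 0.6), "raised"),
]

# Hypothetical servo angles (degrees) sent to the microcontroller per class.
SERVO_POSITIONS = {"flexed": (150, 30), "extended": (40, 90), "raised": (90, 160)}

def classify(features):
    """1-nearest-neighbor over Euclidean (L2) distance."""
    return min(TRAIN, key=lambda t: math.dist(features, t[0]))[1]

def command_for(features):
    """Map a fresh EMG feature vector to a servo command tuple."""
    return SERVO_POSITIONS[classify(features)]
```

A feature vector near the "flexed" training sample, e.g. `command_for((0.75, 0.15, 0.2, 0.1))`, resolves to that class's servo tuple.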

This application is important for precision control in telerobotics. Because of the stochastic nature of the world, it is unrealistic to develop an optimal autonomous system that can handle every situation. The ability to control parts of the robot in critical situations, e.g. bomb defusal, hazardous waste removal, or medical surgery, may mean the difference between life and death.

The main focus of my research was to compare several supervised machine learning algorithms to determine which performed best in terms of accuracy and speed. Using the three initial arm positions, I compared Nearest Neighbors, Random Forests, Naive Bayes, J48 decision trees, logistic regression, and support vector machines. The results showed that Nearest Neighbors (L2 distance, 1/d weighting, k=4) achieved the highest accuracy along with sub-millisecond classification times. The paper on this research has been accepted to the AAAI 2016 conference.
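A comparison of this kind might be scripted as below with scikit-learn; the data here is synthetic stand-in EMG features (three classes, four channels), and only the kNN configuration (L2 distance, 1/d weighting, k=4) comes from the result above.

```python
import time

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for the EMG feature set: 3 arm-position classes,
# 4 channels per sample, drawn around well-separated class centers.
rng = np.random.default_rng(0)
centers = ([1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 1])
X = np.vstack([rng.normal(c, 0.3, (100, 4)) for c in centers])
y = np.repeat([0, 1, 2], 100)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    # The configuration reported above: L2 distance (p=2), 1/d weighting, k=4.
    "kNN": KNeighborsClassifier(n_neighbors=4, weights="distance", p=2),
    "Naive Bayes": GaussianNB(),
    "Random Forest": RandomForestClassifier(random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    t0 = time.perf_counter()
    acc = model.score(X_te, y_te)
    ms = (time.perf_counter() - t0) * 1000 / len(X_te)
    print(f"{name}: accuracy={acc:.2f}, {ms:.3f} ms/sample")
```

On real EMG data the timing per sample is what matters for closed-loop control, which is why the sub-millisecond kNN result was significant.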

Oculus


The videos above show Antonio Sestito and me controlling a three-axis gimbal with the Oculus Rift. In parallel with the EMG project, we worked with the Oculus Rift to improve another aspect of telerobotic control: rather than tying up valuable resources such as the operator's hands, the cameras can be controlled naturally with the operator's head. Our hypothesis was that replacing a computer mouse with the Oculus Rift would improve the operator's response characteristics.

We took advantage of the inertial measurement unit (IMU) inside the Oculus Rift to determine the position of the operator's head as well as its velocity and acceleration. As the second video shows, we also sent a live video feed back to the operator for an immersive feel.
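The head-to-gimbal mapping can be sketched as a quaternion-to-Euler conversion followed by clamping to the gimbal's travel; the quaternion convention and the +/-80 degree limit below are assumptions for illustration, not the actual system's values.

```python
import math

def quat_to_euler(w, x, y, z):
    """Convert an orientation quaternion (as an IMU typically reports)
    to yaw, pitch, roll in degrees (aerospace convention)."""
    yaw = math.degrees(math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z)))
    pitch = math.degrees(math.asin(max(-1.0, min(1.0, 2 * (w * y - z * x)))))
    roll = math.degrees(math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y)))
    return yaw, pitch, roll

def gimbal_command(w, x, y, z, limit=80.0):
    """Clamp each head axis to the gimbal's travel (hypothetical +/-80 deg)."""
    return tuple(max(-limit, min(limit, a)) for a in quat_to_euler(w, x, y, z))
```

A level, forward-facing head (the identity quaternion) maps to a centered gimbal, and a 90-degree head turn saturates the yaw axis at the travel limit.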

The results supported our hypothesis: the operator's response characteristics improved with the Oculus Rift compared to a computer mouse. The paper on this research has been accepted for publication at the ASME IMECE 2015 conference.