EmoCopter

This project aims to create an interface between the Emotiv EPOC EEG headset and the Parrot AR.Drone.

It was started in September 2012 by Vincent de Marignac.

You can find the git repo here: https://github.com/sinlab-semester-2012/EmoCopter

And the concept presentation video here: http://www.youtube.com/watch?v=DrO5bB_H1tw

Attention, developers: this project has only been tested on a Linux environment so far!

Setup

Emotiv EPOC EEG headset

Parrot AR.Drone

Project structure

The project is clearly separated into two parts: the server and the client.

The server is written in C and mainly uses the emokit library and libOSC to extract data from the Emotiv EPOC headset and send it via OSC to a local port.

The client is where I chose to locate all the interesting parts. I set up a Java structure for the Emotiv headset, which consists of a frame and sensors.

The frame is essentially a set of sensors; it holds some information on its state and a few accessor methods. The sensors are of two types: either a simple Sensor, which mainly holds information other than a brain-wave signal, or an EEGCap, which, you guessed it, holds the current value of a particular cap.
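As a rough sketch, the structure described above might look like the following. Aside from the Sensor, EEGCap, and frame names taken from the text, every field and method here is an illustrative assumption, not the project's actual API:

```java
import java.util.ArrayList;
import java.util.List;

// A simple sensor holding a named value (battery level, gyro reading, ...).
class Sensor {
    private final String name;
    private double value;

    Sensor(String name) { this.name = name; }

    String getName() { return name; }
    double getValue() { return value; }
    void setValue(double value) { this.value = value; }
}

// An EEG cap: a sensor whose value is the current raw EEG sample.
class EEGCap extends Sensor {
    EEGCap(String name) { super(name); }
}

// The frame: essentially a set of sensors, plus a few accessor methods.
class Frame {
    private final List<Sensor> sensors = new ArrayList<>();

    void add(Sensor sensor) { sensors.add(sensor); }

    // Look a sensor up by name; returns null if the frame doesn't have it.
    Sensor get(String name) {
        for (Sensor s : sensors) {
            if (s.getName().equals(name)) return s;
        }
        return null;
    }

    int size() { return sensors.size(); }
}
```

The actual classes in the repo carry more state; this only shows how the frame, sensors, and EEG caps relate.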

I started integrating java-ml into this structure so that machine learning can be used easily.

Where to start

First you'll need to download the project from the aforementioned GitHub repo.

Then make sure all required dependencies are met. For more information, read the README on GitHub.

If you are working on a Linux machine, there is a good chance the project will build out of the box. To try it, launch the shell script called 'build.sh'. If everything goes smoothly, it will first look for dependencies, then build the libraries, and finally ask you for your root password. Given that, it will install emokit_osc on your machine.

If you are not using a Linux machine, I do not know whether this will work for you, so you will probably have to modify a few lines here and there just to build the project. For that reason, and because you probably want to start working on the real stuff fairly early, I recommend working under Ubuntu Linux.

Having installed emokit_osc, you can launch it with 'sudo emokit_osc', with the Emotiv USB dongle plugged into your computer. If you don't have a headset and dongle, you can still experiment using random data: 'sudo emokit_osc -n'.

Now, emokit_osc sends OSC packets to port 7000. You can use the Java client program to receive and process this data. I recommend using Eclipse for simplicity; you'll just need to add an existing project to your workspace.
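To give an idea of what those packets look like on the wire, here is a minimal sketch of decoding one OSC message carrying float arguments: a null-terminated address string padded to a multiple of four bytes, a type-tag string such as ',ff', then big-endian arguments. This is not the project's parsing code, just an illustration of the format:

```java
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.List;

// Minimal decoder for a single OSC message with float ('f') arguments.
class OscDecoder {

    // Reads a null-terminated OSC string and skips the padding so that
    // the next field starts on a 4-byte boundary.
    static String readString(ByteBuffer buf) {
        StringBuilder sb = new StringBuilder();
        byte b;
        while ((b = buf.get()) != 0) sb.append((char) b);
        while (buf.position() % 4 != 0) buf.get();
        return sb.toString();
    }

    // Returns the float arguments of the message; other argument types
    // are ignored in this sketch.
    static List<Float> decodeFloats(byte[] packet) {
        ByteBuffer buf = ByteBuffer.wrap(packet); // big-endian by default
        String address = readString(buf);         // address pattern, e.g. "/eeg/F3" (assumed)
        String typeTags = readString(buf);        // e.g. ",ff"
        List<Float> args = new ArrayList<>();
        for (int i = 1; i < typeTags.length(); i++) {
            if (typeTags.charAt(i) == 'f') args.add(buf.getFloat());
        }
        return args;
    }
}
```

The real addresses and argument layout used by emokit_osc may differ; check the server source before relying on this.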

In principle, I've included every necessary client-side library inside the 'lib' folder, so there is no need to worry about dependencies there.

If anything doesn't work and you really can't figure it out, I'll be happy to help: just send me an email at 'vincent dot demarignac at epfl dot ch' and I'll gladly fill in the gaps.

How to use

Now that you're through with the uninteresting part, let's get to the real fun :)

I tried to document my code sufficiently, so you should get the idea by reading it, which will come in handy when you have to correct the mistakes I've made (not that I've made any ;) ) and maintain the code. It's also pretty messy in places, so stay focused :P

What you want to concentrate on is, of course, the client, and its core is called 'oscardrone.java'. It was first written in Processing and then ported to Java (implementing Processing's classes), which explains its Processing-like structure (setup and draw methods). You will notice that OSC packets are caught in an oscEvent method, where they are parsed and distributed over a set of various arrays and objects.

You can already control the AR.Drone version 1 with no further ado, but you'll probably have to work on some improvements to make it run smoothly with version 2.0. For that, look at Nikita's project, in which he conveniently refactored the initial ardroneforP5 library we were given.

The interface provides a short description of each available command; for example, pressing '0' shows all the info and 'p' displays the plots.

What's next?

Well, as you might have guessed, this project is far from finished. Here I will briefly describe how I envision the next steps.

A few very critical subjects still need to be addressed. You guessed it: a lot remains to be done, spanning a variety of fields: neurology, signal processing, and machine learning. The real next steps are testing other classifier algorithms, improving the code and GUI, removing noise, and extracting good features.

P300 study

I investigated the P300 approach and concluded that this method is not fast enough to control a drone.

P300 is a potential arising in response to an external stimulus, which can be either visual or auditory. A well-known method using P300 is the P300 speller, which flashes letters and numbers by rows and columns. When the right column or row flashes, your brain produces this recognizable potential after a 'small' delay of 300 milliseconds. This means that for every command you have to wait until both the right column and the right row have flashed at least once. That is inconvenient, because when controlling a drone you want it to react fast, to avoid bumping into walls and so on.
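To make the timing concrete, here is a back-of-the-envelope calculation for a 6x6 speller grid. The flash duration, inter-stimulus interval, and number of averaged repetitions are illustrative assumptions, not measured values:

```java
// Rough lower bound on the time needed for one P300 speller selection.
class P300Latency {

    // rows + cols flashes make up one full sequence; several sequences
    // are usually averaged to get a reliable detection.
    static double selectionSeconds(int rows, int cols,
                                   double flashMs, double isiMs,
                                   int repetitions) {
        int flashesPerSequence = rows + cols;
        double sequenceMs = flashesPerSequence * (flashMs + isiMs);
        return repetitions * sequenceMs / 1000.0; // milliseconds -> seconds
    }
}
```

With assumed values of a 100 ms flash, a 75 ms inter-stimulus interval, and 3 averaged repetitions, one selection already takes over 6 seconds, which is in line with the latencies reported by other studies.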

Other studies on this topic report a typical latency of around 5 (at best) to 10 seconds per selection, but are still able to achieve a 100% success rate.

A note from the first developer

First of all, I'd like to thank SINLAB and the LDM for giving me the opportunity to work on this project: even if I didn't finish it the way I (and they) expected, being able to lay the groundwork is a wonderful thing, and I had a wonderful time experimenting. Thanks also to all those who helped me directly or indirectly, for their great moral and technical support.

I really do hope and believe this project will come to a conclusive end, even though the scope turned out to be larger than initially thought and doubts about its feasibility arose among us.

Similar work

FlyingBuddy2: http://www.youtube.com/watch?v=JNCsUJt99YM

Swarm Extreme: http://www.youtube.com/watch?v=2hHRDyeV0TQ

Puzzlebox Orbit: http://brainstorms.puzzlebox.info/

Useful links

EEG wiki: http://en.wikipedia.org/wiki/Electroencephalography

P300 detection: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=4344360

java-ml: http://java-ml.sourceforge.net/