Using Deep Learning to “read your thoughts” — with Keras and EEG

[image source: Greg Dunn]

In his Medium article, Justin Alvey demonstrates how to build a brain-computer interface that can “read your thoughts” using Keras and EEG.

He follows a simple approach that uses deep learning and EEG signals to perform a task such as recognizing words from brain signals.

Direct brain-to-computer communication has gained more and more attention over the past several years. With advances in sensor technology, and especially in machine learning, it is now possible to build a system that recognizes words from brain signals even at home.

In his article, Justin explains how to do this using commodity hardware and the Keras deep learning library.

He used an OpenBCI board to capture EEG, or more precisely the weaker EMG signals produced by sub-vocalization. He then gathered training data for four words to sub-vocalize: enhance, stop, interlinked, and cells.
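The acquisition code itself isn't reproduced in the article summary, but a minimal sketch of streaming raw samples from an OpenBCI Cyton board with the open-source brainflow library (not necessarily the interface Justin used; the serial port and the choice of the first four channels are assumptions) could look like this:

```python
import time

from brainflow.board_shim import BoardShim, BoardIds, BrainFlowInputParams

# Assumption: a Cyton board on this serial port; adjust for your setup.
params = BrainFlowInputParams()
params.serial_port = "/dev/ttyUSB0"

board = BoardShim(BoardIds.CYTON_BOARD.value, params)
board.prepare_session()
board.start_stream()
time.sleep(5)  # capture five seconds of raw samples
data = board.get_board_data()  # rows are channels, columns are samples
board.stop_stream()
board.release_session()

# Keep only the electrode rows; here we assume four electrodes were placed.
exg_rows = BoardShim.get_eeg_channels(BoardIds.CYTON_BOARD.value)
emg = data[exg_rows[:4]]
```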

He recorded sub-vocalized words for 20 minutes using the OpenBCI board and a simple Python interface. After collecting the data, he cleaned and filtered it to remove interference noise and normalized it.
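The article doesn't spell out the exact filter settings, but a typical preprocessing pipeline for this kind of signal (a hedged sketch; the 250 Hz sampling rate, the 60 Hz mains frequency, and the band-pass cutoffs are assumptions) is a notch filter against mains hum, a band-pass filter, and per-channel normalization:

```python
import numpy as np
from scipy.signal import butter, iirnotch, filtfilt

FS = 250.0  # assumed OpenBCI Cyton sampling rate in Hz

def preprocess(emg, fs=FS):
    """Notch out mains interference, band-pass, and z-score each channel.

    emg: array of shape (channels, samples).
    """
    # 60 Hz notch against mains interference (50 Hz in some regions).
    b_notch, a_notch = iirnotch(w0=60.0, Q=30.0, fs=fs)
    emg = filtfilt(b_notch, a_notch, emg, axis=-1)

    # Band-pass to keep the EMG-relevant band; cutoffs are assumptions.
    b_bp, a_bp = butter(4, [5.0, 100.0], btype="bandpass", fs=fs)
    emg = filtfilt(b_bp, a_bp, emg, axis=-1)

    # Normalize each channel to zero mean and unit variance.
    mean = emg.mean(axis=-1, keepdims=True)
    std = emg.std(axis=-1, keepdims=True)
    return (emg - mean) / (std + 1e-8)
```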

With the data cleaned up, he built a convolutional neural network (CNN) to perform the word-recognition task. To make a CNN fit this input, he adapted the input layer: instead of two spatial dimensions it receives a single one (time), and instead of three color channels it receives four depth channels, one per EMG channel.
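The article's exact architecture isn't reproduced here, but a minimal Keras sketch of the idea (one temporal dimension in, four word classes out; the layer sizes and the 250-sample window are assumptions) could look like:

```python
from tensorflow import keras
from tensorflow.keras import layers

WINDOW = 250     # assumed samples per training window
N_CHANNELS = 4   # one depth channel per EMG electrode
N_WORDS = 4      # enhance, stop, interlinked, cells

model = keras.Sequential([
    # Conv1D slides over time only; the four EMG channels play the role
    # that the three color channels play in an image CNN.
    layers.Conv1D(32, kernel_size=7, activation="relu",
                  input_shape=(WINDOW, N_CHANNELS)),
    layers.MaxPooling1D(2),
    layers.Conv1D(64, kernel_size=5, activation="relu"),
    layers.MaxPooling1D(2),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(N_WORDS, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```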

He showed that the model is able to classify the EMG signals into the four word classes. To validate the model’s performance, he used a different set of data recorded with the electrodes.
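Evaluating on a recording held out from training guards against the model memorizing session-specific artifacts. A hedged sketch of that step, using the model above (`X_train`, `y_train`, `X_holdout`, and `y_holdout` are hypothetical arrays of preprocessed windows with integer word labels):

```python
# Hypothetical data: windows of shape (n, WINDOW, N_CHANNELS) with
# integer labels 0..3 for the four words. The held-out set comes from
# a separate recording, not a random split of the training session.
history = model.fit(X_train, y_train,
                    validation_split=0.1,
                    epochs=30, batch_size=32)

loss, acc = model.evaluate(X_holdout, y_holdout)
print(f"held-out accuracy: {acc:.2%}")
```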

The complete demonstration can be found in his blog post.
