My experience with a smartphone brainscanner
|Self researcher(s)||Jakob Eg Larsen|
|Related tools||EEG (Electroencephalography), phone|
|Related topics||Mood and emotion, Cognition, Brain activity|
Builds on project(s)
|Show and Tell Talk Infobox|
|Event name||2011 QS Europe Conference|
My experience with a smartphone brainscanner is a Show & Tell talk by Jakob Eg Larsen that has been imported from the Quantified Self Show & Tell library. The talk was given on 2011/11/26 and is about Mood and emotion, Cognition, and Brain activity.
Description
A description of this project as introduced by Quantified Self follows:
Jakob Larsen and his team in the Mobile Informatics Lab at the Technical University of Denmark have developed a way to build a real-time 3-D model of your brain using a smartphone and the Emotiv EPOC game controller headset. In this ignite talk, Jakob describes his experience with a smartphone brain scanner, and how the fourteen sensors of his mobile EEG device rival a traditional lab EEG setup.
Video and transcript
My experience with a smartphone brainscanner by Jakob Eg Larsen
I’m very excited to be here to demo and tell you about our experience in developing this smartphone brain scanner. This is work done at the Technical University of Denmark, and you can see the team members there. The smartphone brain scanner essentially allows you to hold your brain in the palm of your hand. What I have here on the smartphone, and what you can see on the display, is a 3-D model of your brain, and I am using this Emotiv EPOC headset, which is a 14-channel neuroheadset; the data is transmitted directly to the smartphone. The neuroheadset is actually meant to be a game controller for your PC. It comes with a small USB receiver that you plug into your PC, along with some training software, and after a short training session it allows you to control an object displayed on your PC. But what we have done in the lab is connect it directly to the smartphone. We have an app on the smartphone that analyzes the data in real time. We get data from the 14 different positions on the scalp, and through a sophisticated model we try, based on those 14 positions, to estimate the sources in the brain where the activity occurs. Then we map that onto the 3-D visualization that you see on the screen. It runs in real time, with a latency of something like 150 milliseconds, so it actually shows you the brain activity as it occurs in the brain. I think this is really exciting, because we can do two things: we can monitor and give real-time feedback on the smartphone by analyzing this data, but we can also do EEG recordings over longer time durations. And consider what a traditional EEG monitoring setup in a lab looks like.
It is very time consuming to set up all of those wires, and it really limits the user in what he or she can do in terms of mobility, especially in the kinds of tasks that you can carry out while wearing it. But with the mobile smartphone brain scanner you are completely wireless, and you can move around in your natural environment and do the recordings. These are a few shots from Danish national TV, where they did a small experiment in which a test person wore this while out shopping, to see what happens when decisions occur and what decides what you are going to buy. It enables a lot of things, because it is fairly low-cost equipment: we can use it on a lot of people, and, as TechCrunch (?), you can actually do some very creative things. This is the team that has developed it. So I think this is really exciting, because we are here talking about all of these things that we can capture about ourselves, all the data that we can get from sensors. The steps we take, mood tracking, heart rate, and on and on, collecting a lot of data. But I think it is really interesting to try to put this into context and to understand what we are encountering, and especially important if we can involve something like this to figure out what is going on in the brain, which is what is driving the decisions that we make. Maybe then we can try to understand what is actually driving potential behaviour change. I think this is really exciting stuff, and we are doing this as a university project and we want to make it available for others to play with. The version I have here is a Nokia N900 on the Maemo platform, and if you happen to have such a device, you can actually download the software for the phone and try it for yourself.
Just recently we have been able to support other mobile platforms as well, so this now also runs on Linux-based platforms, including Android. So you can potentially run this on Android phones and tablets. If you want to know more, please visit our lab website for more information, or you can tweet me.
With those words thank you for your attention.
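The pipeline described in the talk, taking 14 channels of scalp EEG, estimating where in the brain the activity originates, and mapping the result onto a 3-D model, can be sketched in a few lines. The snippet below is a minimal illustration, not the team's actual code: it assumes a made-up random lead field (forward) matrix and uses a Tikhonov-regularized minimum-norm estimate, one common choice for this kind of EEG inverse problem.

```python
import numpy as np

# Hypothetical sketch: 14 scalp channels -> regularized inverse model ->
# estimated source activity per location in the 3-D brain model.
# The lead field here is random for illustration; a real system would
# derive it from a head model.

rng = np.random.default_rng(0)

n_channels = 14   # Emotiv EPOC electrode count
n_sources = 100   # illustrative number of cortical source locations

# Forward (lead field) matrix: how each source projects onto the electrodes.
lead_field = rng.standard_normal((n_channels, n_sources))

def minimum_norm_estimate(eeg_sample, L, alpha=1.0):
    """Tikhonov-regularized minimum-norm source estimate.

    Solves min_s ||L s - x||^2 + alpha ||s||^2, whose closed form is
    s = L^T (L L^T + alpha I)^{-1} x  -- cheap because L L^T is only
    14x14, which is what makes per-sample real-time use plausible.
    """
    gram = L @ L.T + alpha * np.eye(L.shape[0])
    return L.T @ np.linalg.solve(gram, eeg_sample)

# One simulated EEG sample from the 14 electrodes: a single active source
# plus a little sensor noise.
true_sources = np.zeros(n_sources)
true_sources[42] = 5.0
eeg = lead_field @ true_sources + 0.1 * rng.standard_normal(n_channels)

estimate = minimum_norm_estimate(eeg, lead_field)
print(estimate.shape)  # (100,) -- one activity value per source location
```

Each estimated source vector would then color the corresponding vertices of the 3-D brain mesh, repeated for every incoming sample to give the real-time display.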
About the presenter
Jakob Eg Larsen gave this talk.