


e10bci eNTERFACE end presentation


Danny Oude Bos

on 31 August 2010


Transcript of e10bci eNTERFACE end presentation

Looking Around in a Virtual World

[System diagram: the user's EEG & EOG feed the BCI; the BCI sends covert attention (CA) & eye movement (EM) output to the application; the application returns game state and feedback, alongside keyboard and mouse input.]

Objective: to develop a smart camera for virtual worlds, based on covert attention and eye movement (EEG and EOG). We will implement these pipelines, evaluate them offline, and design a mapping to camera movement, to culminate in online experiments to determine the usability and user experience.

Implemented in:
  • Python, with numpy, scipy, matplotlib, cvxopt, golem, psychic
  • Reads from ActiView (TCP)
  • Deals with different markers, window sizes, window steps, and preprocessing, for each pipeline
  • Writes pipeline results to any application that connects to it, in a standard JSON protocol (TCP)

Demo movie!
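The slides only name a "standard JSON protocol" over TCP; the message fields below (`pipeline`, `result`) are invented for illustration. A minimal sketch of such a results server, pushing one newline-delimited JSON object per pipeline result to every connected client:

```python
import json
import socket
import threading

class ResultServer:
    """Hypothetical sketch: broadcast pipeline results as JSON lines over TCP.

    The actual protocol used in the project is not specified on the slide;
    newline-delimited JSON with invented field names stands in for it here.
    """

    def __init__(self, host="127.0.0.1", port=9000):
        self.clients = []
        self.lock = threading.Lock()
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        self.sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        self.sock.bind((host, port))
        self.sock.listen(5)
        # accept clients in the background so publish() never blocks on accept
        threading.Thread(target=self._accept, daemon=True).start()

    def _accept(self):
        while True:
            conn, _addr = self.sock.accept()
            with self.lock:
                self.clients.append(conn)

    def publish(self, pipeline, result):
        """Send one result from one pipeline to all connected applications."""
        msg = (json.dumps({"pipeline": pipeline, "result": result}) + "\n").encode()
        with self.lock:
            for conn in list(self.clients):
                try:
                    conn.sendall(msg)
                except OSError:
                    self.clients.remove(conn)  # drop disconnected clients
```

An application would connect with a plain TCP socket and read one JSON object per line, decoupling the BCI process from any particular game engine.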
Natural interaction:
  • familiar user task
  • predictable system reaction

Thanks everyone!

Danny Plass-Oude Bos
Matthieu Duvinage
Oytun Oktay
Jaime Delgado Saa
Huseyin Guruler
Aphan Istanbullu
Marijn van Vliet
Bram van de Laar
Mannes Poel
Ali Bahramisharif
Linsey Roijendijk
Luca Tonin
Boris Reuderink

What is a good pipeline?

NEW in this project:
  • Covert attention from different fixation points!
  • In-game CA training in an immersive environment
  • Online CA in an immersive environment
  • Evaluation of an EOG eye tracker on a target basis
  • EOG eye tracker for camera adjustment in a VE
  • A game that uses naturally occurring brain activity in a natural way
  • A questionnaire to assess usability and UX of a BCI system

User evaluation questionnaire:
  • 10 items: SUS
  • 21 items: presence (control, immersion, naturalness, interface quality) [Witmer]
  • 5 items: BCI (interface quality, control)
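For reference, the 10-item SUS mentioned above is conventionally scored by mapping odd-numbered items to (response - 1), even-numbered items to (5 - response), and scaling the sum by 2.5 to a 0-100 range. A small helper (the function name is ours):

```python
def sus_score(responses):
    """Standard scoring for the 10-item System Usability Scale (SUS).

    `responses` are the ten 1-5 Likert answers in questionnaire order.
    Odd-numbered items contribute (response - 1), even-numbered items
    contribute (5 - response); the sum is scaled by 2.5 to give 0-100.
    """
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 0 else (5 - r)   # i is 0-based
                for i, r in enumerate(responses))
    return total * 2.5
```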
Future work -- protocol:
  • covert attention clinical training
  • in-game training and game with EM; questionnaire
  • game with mouse; questionnaire

Research questions:
  • Is there a difference in performance / brain activity between covert attention clinical training, in-game training, and the game?
  • Is there a difference in usability / UX between mouse-based and eye-movement-based camera adjustment?

How well does it perform?
  • detectable directions? varying fixations? distracting backgrounds?
  • number of trials? window size? in-game training?

Dataset: 6 subjects
  • 50 repetitions x 3 fixation points
  • 4 focus directions + neutral
  • blink detection

Pipeline CA1:
  channels: O-P
  window: 0.5-2.0s
  9-11Hz STFT, z-score
  SVM(0.1)

Pipeline CA2:
  channels: O-P
  window: 0.5-2.0s
  CAR, whitening, 9-11Hz STFT, z-score
  SVM(2.0)

Pipeline CA3:
  channels: O-P
  window: 0.5-2.0s
  CAR, 8-14Hz, whitening, cov
  logistic regression

Results:
  • 58%, 56%, 67% (but Matlab :/)
  • 32% (37%) 4-class [CA2]
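The CA pipelines above were built on golem and psychic; as a rough equivalent, here is a CA2-style sketch (CAR, 9-11 Hz band power, z-score, SVM) using numpy and scikit-learn instead. The whitening step is omitted, a plain FFT band power stands in for the STFT features, and reading "SVM(2.0)" as an SVM with C=2.0 is our assumption:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def bandpower_features(epochs, fs, band=(9.0, 11.0)):
    """CAR + FFT band power per channel (stand-in for the STFT step).

    `epochs` has shape (n_trials, n_channels, n_samples), e.g. the
    occipital-parietal channels over the 0.5-2.0 s window after the cue.
    """
    car = epochs - epochs.mean(axis=1, keepdims=True)   # common average reference
    spec = np.abs(np.fft.rfft(car, axis=2)) ** 2        # power spectrum per channel
    freqs = np.fft.rfftfreq(epochs.shape[2], d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return spec[:, :, mask].mean(axis=2)                # (n_trials, n_channels)

def make_ca2_classifier():
    # z-scoring of features + SVM, C=2.0 as read off the slide (assumption)
    return make_pipeline(StandardScaler(), SVC(C=2.0))
```

Training is then `make_ca2_classifier().fit(bandpower_features(X, fs), y)`, with one label per trial (focus direction or neutral).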
  • 58% (65%) 2-class [CA2]
  • 67% (77%) 2-class [CA3]

Varying fixations:
  • left: 36%, center: 45%, right: 36%, pooled: 40%

Distracting backgrounds -- test with subject O:
  • 69% black background
  • 72% game screenshot background

Test with subject O: similar performance

NEW: automatic threshold selection

Pipeline performance

Dataset: 4 subjects
  • 10 eye blinks (2)
  • 25 repetitions for 25 fixation points
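The slides only name "automatic threshold selection" for the EOG blink detector without describing the method. One plausible sketch, under our own assumptions (threshold set to half the median peak amplitude of the calibration blinks, detection by first threshold crossing with a refractory period):

```python
import numpy as np

def select_threshold(calibration_eog, fs, blink_len=0.4, n_blinks=10):
    """Pick a detection threshold from a calibration recording of blinks.

    Assumption: use half the median of the largest `n_blinks` windowed
    peak amplitudes (the calibration run above has ~10 blinks).
    """
    win = int(blink_len * fs)
    x = np.abs(calibration_eog - np.median(calibration_eog))
    peaks = [x[i:i + win].max() for i in range(0, len(x) - win, win)]
    top = sorted(peaks)[-n_blinks:]
    return 0.5 * np.median(top)

def detect_blinks(eog, fs, threshold, refractory=0.3):
    """Indices where the rectified EOG first crosses the threshold."""
    x = np.abs(eog - np.median(eog))
    gap = int(refractory * fs)
    events, last = [], -gap
    for i in np.flatnonzero(x > threshold):
        if i - last >= gap:          # new blink only after the refractory gap
            events.append(i)
        last = i
    return events
```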