Research not for publishing papers, but for fun, for satisfying curiosity, and for revealing the truth.

This blog reports the latest progress in
(1) Signal Processing and Machine Learning for Biomedicine, Neuroimaging, Wearable Healthcare, and Smart-Home
(2) Sparse Signal Recovery and Compressed Sensing of Signals by Exploiting Spatiotemporal Structures
(3) My Works


Monday, February 6, 2012

A New Paper: Evolving Signal Processing for Brain-Computer Interface

Our survey paper on BCI was recently accepted by Proceedings of the IEEE (Special 100th Anniversary Issue):

Scott Makeig, Christian Kothe, Tim Mullen, Nima Bigdely-Shamlo, Zhilin Zhang, Kenneth Kreutz-Delgado, Evolving Signal Processing for Brain-Computer Interface, Proceedings of the IEEE, 2012

The paper surveys the past, present, and future of signal processing and machine learning for cognitive state assessment, especially BCI, wireless EEG, and mobile EEG.

The paper can be downloaded from here.

Here is the abstract:
Because of the increasing portability and wearability of noninvasive electrophysiological systems that record and process electrical signals from the human brain, automated systems for assessing changes in user cognitive state, intent, and response to events are of increasing interest. Brain-computer interface (BCI) systems can make use of such knowledge to deliver relevant feedback to the user or to an observer, or within a human-machine system to increase safety and enhance overall performance. Building robust and useful BCI models from accumulated biological knowledge and available data is a major challenge, as are technical problems associated with incorporating multimodal physiological, behavioral, and contextual data that may in future be increasingly ubiquitous. While performance of current BCI modeling methods is slowly increasing, current performance levels do not yet support widespread uses. Here we discuss the current neuroscientific questions and data processing challenges facing BCI designers and outline some promising current and future directions to address them.
