My blogs reporting quantitative financial analysis, artificial intelligence for stock investment & trading, and latest progress in signal processing and machine learning

Tuesday, November 8, 2011

Updated T-MSBL code

I have just updated the T-MSBL/T-SBL code. With the updated version, you do not need to tune any parameters for a general compressed sensing problem. By a general compressed sensing problem, I mean one where the columns of the matrix A have unit L2-norm. When your problem does not satisfy this, you can first transform your original problem:
Y = A X + V
to
Y = A W W^{-1} X + V  = A' X' + V
such that A' = A W has unit-norm columns (W is a diagonal matrix whose entries are the reciprocal column norms of A). Once you obtain the result X', you can recover X via X = W X'.
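As a sketch of this preprocessing step (written in Python/NumPy for illustration rather than MATLAB; the matrices here are random placeholders, not the T-MSBL API):

```python
import numpy as np

# Hypothetical illustration: normalize the columns of A to unit L2-norm
# before calling a solver such as T-MSBL, then undo the scaling afterwards.
rng = np.random.default_rng(0)
A = rng.standard_normal((10, 20))       # dictionary with non-unit columns
X_true = np.zeros((20, 3))
X_true[[2, 7, 11], :] = rng.standard_normal((3, 3))  # row-sparse X
Y = A @ X_true                          # noiseless case, for simplicity

norms = np.linalg.norm(A, axis=0)       # column L2-norms of A
W = np.diag(1.0 / norms)                # diagonal scaling matrix
A_prime = A @ W                         # A' = A W has unit-norm columns

# A solver would be applied to Y = A' X' + V; here we just verify
# the algebra: X' = W^{-1} X, and the original X is recovered as W X'.
X_prime = np.diag(norms) @ X_true       # W^{-1} X
X_rec = W @ X_prime                     # X = W X'
assert np.allclose(A_prime @ X_prime, Y)
assert np.allclose(X_rec, X_true)
```

The same two lines of scaling and unscaling wrap around whichever solver you use; only A' and the recovered X' change between problems.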


Calling T-MSBL is easy:


o   When noise is large (e.g. SNR <=6 dB)
X_est = TMSBL(A, Y, 'noise', 'large')

o   When noise is mild (e.g. 7 dB <= SNR <=22 dB)
X_est = TMSBL(A, Y, 'noise', 'mild')

o   When noise is small (e.g. SNR >22 dB)
X_est = TMSBL(A, Y, 'noise', 'small')

o   When no noise
X_est = TMSBL(A, Y, 'noise', 'no')

Note that the thresholds 6 dB and 22 dB above are not exact values; they just give you a rough sense of what counts as the 'small noise' case, the 'mild noise' case, and the 'strong noise' case. In this sense, T-MSBL does not require knowledge of the noise level.

When you apply T-MSBL to a practical problem where you really have no idea of the noise strength (such as gene feature extraction), simply use the call corresponding to the 'mild noise' case, i.e.
X_est = TMSBL(A, Y, 'noise', 'mild') 


I will update the code in the near future so that in every case (noisy or noiseless, real or complex variables, large-scale or small-scale data) you only need to call X_est = TMSBL(A, Y). However, I am currently very busy with my ongoing papers (four journal papers in three fields), so please forgive me for not doing this now.

Friday, November 4, 2011

Minisymposium on New Dimensions in Brain-Machine Interfaces at UCSD

Wednesday, November 9, 2011
1pm-6pm
Fung Auditorium
Powell-Focht Bioengineering Hall
UC San Diego

The minisymposium highlights the latest advances and emerging directions in
brain-machine and neuron-silicon interface technology and their
applications to neuroscience and neuroengineering.  Topics include
high-dimensional EEG and ECoG systems, wireless and unobtrusive
brain-machine interfaces, flexible bioelectronics, real-time decoding of
brain and motor activity, and signal processing methods for intelligent
human-system interfaces.


PROGRAM

1:00-1:10pm    Welcome

1:10-1:50pm    Engineering hope with biomimetic systems
              Wentai Liu, UC Santa Cruz

1:50-2:30pm    A low power system-on-chip design for real-time ICA based BCI applications
              Wai-Chi Fang, National Chiao-Tung University, Taiwan

2:30-3:10pm    Developing practical non-contact EEG electrodes
              Yu Mike Chi, Cognionics

3:10-3:50pm    A new platform for BCI: from iBrain to the Stephen Hawking project
              Philip Low, Neurovigil


3:50-4:20pm    Coffee break


4:20-5:00pm    Interdisciplinary approaches to design high performance brain-machine interfaces
              Todd P. Coleman, UC San Diego

5:00-5:40pm    Evolving data collection and signal processing methods for intelligent human-system interfaces
              Scott Makeig, UC San Diego

5:40-6:00pm    Panel discussion


Organized by:

Tzyy-Ping Jung <tpjung@ucsd.edu>
Center for Advanced Neurological Monitoring,
Institute of Engineering in Medicine <http://iem.ucsd.edu>, and
Institute for Neural Computation <http://inc.ucsd.edu>

With support from:

Qualcomm <http://www.qualcomm.com>, and
Brain Corporation <http://www.braincorporation.com>