Stanford 50: State of the Art and Future Directions of Computational Mathematics and Numerical Computing


  • March 29, 2007
  • 12:15 pm - 12:40 pm

The impact of numerical linear algebra in computational biomedical signal processing

Sabine Van Huffel (Katholieke Universiteit Leuven)

In biomedical signal processing, the aim is to extract clinically, biochemically, or pharmaceutically relevant information (e.g., metabolite concentrations in the brain), expressed as model parameters, from low-quality measurements in order to enable improved medical diagnosis. Typically, biomedical data are affected by large measurement errors, largely due to the non-invasive nature of the measurement process or to severe constraints that keep the input signal as low as possible for safety and bioethical reasons. Accurate and automated quantification of this information requires an ingenious combination of the following:

  • an adequate pretreatment of the data,
  • the design of an appropriate model and model validation,
  • a fast and numerically robust model parameter quantification method,
  • an extensive evaluation and performance study using in-vivo and patient data, up to the embedding of the advanced tools into user-friendly interfaces for use by clinicians.

The underlying computational signal processing problems can be solved by making use of linear algebra, signal processing, system theory and optimisation. In particular, it is shown how computational linear algebra kernels, such as the Singular Value Decomposition (SVD), Principal Component Analysis (PCA), Canonical Correlation Analysis (CCA), Least Squares, Total Least Squares, Independent Component Analysis (ICA), and related methods, can be used as building blocks for higher-level signal processing algorithms. In addition, the application of these algorithms and their benefits will be briefly illustrated in a variety of case studies, including Magnetic Resonance Spectroscopic Imaging and epileptic seizure detection.
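As a minimal sketch of the SVD-as-building-block idea, consider subspace-based denoising of a noisy signal: a sum of a few damped sinusoids (the standard model for MR spectroscopy signals) yields a nearly low-rank Hankel matrix, so truncating the small singular values suppresses noise. The function name `hankel_denoise`, the test signal, and the chosen rank below are illustrative assumptions, not the speaker's actual method:

```python
import numpy as np

def hankel_denoise(signal, rank, rows=None):
    """Denoise a 1-D signal via truncated SVD of its Hankel matrix.

    Illustrative sketch: a sum of `rank` damped complex exponentials
    gives a rank-`rank` Hankel matrix, so discarding the remaining
    singular values removes much of the noise.
    """
    n = len(signal)
    rows = rows or n // 2
    cols = n - rows + 1
    # Hankel matrix: H[i, j] = signal[i + j]
    H = np.array([signal[i:i + cols] for i in range(rows)])
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    # Keep only the leading `rank` singular triplets (low-rank approximation)
    H_low = (U[:, :rank] * s[:rank]) @ Vt[:rank, :]
    # Recover a 1-D signal by averaging H_low along its anti-diagonals
    denoised = np.zeros(n)
    counts = np.zeros(n)
    for i in range(rows):
        denoised[i:i + cols] += H_low[i]
        counts[i:i + cols] += 1
    return denoised / counts

rng = np.random.default_rng(0)
t = np.arange(256)
clean = np.cos(2 * np.pi * 0.05 * t) * np.exp(-0.005 * t)   # one damped cosine
noisy = clean + 0.3 * rng.standard_normal(t.size)
est = hankel_denoise(noisy, rank=2)  # a real damped cosine has Hankel rank 2
```

In practice, variants of this subspace idea (e.g., HSVD-type methods) underlie automated metabolite quantification in MR spectroscopy; the rank is set by the number of spectral components expected in the model.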
