Bayesian multi-task learning for decoding multi-subject neuroimaging data

Research output: Contribution to journal › Article › peer-review

44 Citations (Scopus)

Abstract

Decoding models based on pattern recognition (PR) are becoming increasingly important tools for neuroimaging data analysis. In contrast to alternative (mass-univariate) encoding approaches that use hierarchical models to capture inter-subject variability, inter-subject differences are not typically handled efficiently in PR. In this work, we propose to overcome this problem by recasting the decoding problem in a multi-task learning (MTL) framework. In MTL, a single PR model is used to learn different but related "tasks" simultaneously. The primary advantage of MTL is that it makes more efficient use of the data available and leads to more accurate models by making use of the relationships between tasks. In this work, we construct MTL models where each subject is modeled by a separate task. We use a flexible covariance structure to model the relationships between tasks and induce coupling between them using Gaussian process priors. We present an MTL method for classification problems and demonstrate a novel mapping method suitable for PR models. We apply these MTL approaches to classifying many different contrasts in a publicly available fMRI dataset and show that the proposed MTL methods produce higher decoding accuracy and more consistent discriminative activity patterns than currently used techniques. Our results demonstrate that MTL provides a promising method for multi-subject decoding studies by focusing on the commonalities between a group of subjects rather than the idiosyncratic properties of different subjects.
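The coupling idea described in the abstract can be illustrated with a minimal numpy sketch. This is not the paper's actual model: the subject count, feature dimensions, the shared-correlation form of the inter-task covariance `B`, and the use of Gaussian process regression on ±1 labels as a stand-in for the Bayesian classifier are all illustrative assumptions. The sketch only shows how an inter-task covariance combined with a per-voxel kernel couples subjects' decoding models within a single GP.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (hypothetical sizes): S subjects ("tasks"), n trials each, d voxels.
S, n, d = 3, 20, 50
X = rng.standard_normal((S, n, d))        # per-subject feature matrices
y = np.sign(rng.standard_normal((S, n)))  # +/-1 class labels per trial

# Inter-task covariance B couples the subjects' models; positive
# off-diagonal entries encourage shared discriminative patterns.
rho = 0.7  # assumed shared-correlation strength
B = (1 - rho) * np.eye(S) + rho * np.ones((S, S))

# Joint kernel over all (subject, trial) pairs:
#   K[(s,i),(t,j)] = B[s,t] * <x_si, x_tj>
# i.e. a task covariance multiplied elementwise with a linear kernel
# over voxels (an intrinsic-coregionalization-style construction).
Xs = X.reshape(S * n, d)
task = np.repeat(np.arange(S), n)
K = B[task[:, None], task[None, :]] * (Xs @ Xs.T)

# GP regression on the +/-1 labels (a common simplification of GP
# classification), with assumed noise variance sigma2.
sigma2 = 1.0
alpha = np.linalg.solve(K + sigma2 * np.eye(S * n), y.ravel())

# Predict the class of a new trial for subject 0; trials from the
# other subjects contribute through the off-diagonal entries of B.
x_new = rng.standard_normal(d)
k_new = B[task, 0] * (Xs @ x_new)
pred = np.sign(k_new @ alpha)
```

Setting `rho = 0` recovers independent per-subject models, so the single hyperparameter controls how strongly the decoder pools information across subjects.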
Original language: English
Article number: N/A
Pages (from-to): 298-311
Number of pages: 14
Journal: NeuroImage
Volume: 92
Issue number: N/A
DOIs
Publication status: Published - 15 May 2014
