Dictionary Learning with BLOTLESS Update

Qi Yu, Wei Dai, Zoran Cvetkovic, Jubo Zhu

Research output: Contribution to journal › Article › peer-review


Abstract

Algorithms for learning a dictionary to sparsely represent a given dataset typically alternate between sparse coding and dictionary update stages. Methods for dictionary update aim to minimise the expansion error by updating dictionary vectors and expansion coefficients, given the patterns of non-zero coefficients obtained in the sparse coding stage. We propose a block total least squares (BLOTLESS) algorithm for dictionary update. BLOTLESS updates a block of dictionary elements and the corresponding sparse coefficients simultaneously. In the error-free case, three necessary conditions for exact recovery are identified. Lower bounds on the number of training data are established so that the necessary conditions hold with high probability. Numerical simulations show that the bounds approximate well the number of training data needed for exact dictionary recovery. Numerical experiments further demonstrate several benefits of dictionary learning with BLOTLESS update compared with state-of-the-art algorithms, especially when the amount of training data is small.
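The alternation described in the abstract can be illustrated with a minimal sketch. The code below is not the BLOTLESS algorithm itself; it is a generic dictionary-learning loop assuming orthogonal matching pursuit for the sparse coding stage and a simple least-squares (MOD-style) dictionary update, on hypothetical synthetic data, to show the two alternating stages.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic data: dictionary D_true (n x K) with unit-norm
# atoms, and codes X_true with s non-zeros per column.
n, K, N, s = 16, 32, 200, 3
D_true = rng.standard_normal((n, K))
D_true /= np.linalg.norm(D_true, axis=0)
X_true = np.zeros((K, N))
for j in range(N):
    support = rng.choice(K, s, replace=False)
    X_true[support, j] = rng.standard_normal(s)
Y = D_true @ X_true  # training data

def sparse_code(Y, D, s):
    """Sparse coding stage: orthogonal matching pursuit, s atoms per column."""
    K, N = D.shape[1], Y.shape[1]
    X = np.zeros((K, N))
    for j in range(N):
        r, support = Y[:, j].copy(), []
        for _ in range(s):
            support.append(int(np.argmax(np.abs(D.T @ r))))
            coef, *_ = np.linalg.lstsq(D[:, support], Y[:, j], rcond=None)
            r = Y[:, j] - D[:, support] @ coef
        X[support, j] = coef
    return X

def dictionary_update(Y, X):
    """Dictionary update stage: least-squares fit for fixed coefficients,
    followed by atom renormalisation (MOD-style, not BLOTLESS)."""
    D = Y @ np.linalg.pinv(X)
    return D / np.maximum(np.linalg.norm(D, axis=0), 1e-12)

# Alternate the two stages from a random initial dictionary.
D = rng.standard_normal((n, K))
D /= np.linalg.norm(D, axis=0)
for _ in range(20):
    X = sparse_code(Y, D, s)
    D = dictionary_update(Y, X)

err = np.linalg.norm(Y - D @ X) / np.linalg.norm(Y)
```

BLOTLESS differs in the second stage: rather than fixing the coefficients while updating the dictionary, it updates a block of dictionary elements together with the corresponding sparse coefficients, given only the non-zero patterns from the coding stage.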

Original language: English
Article number: 8985423
Pages (from-to): 1635-1645
Number of pages: 11
Journal: IEEE Transactions on Signal Processing
Volume: 68
Early online date: 6 Feb 2020
DOIs
Publication status: Published - 6 Feb 2020

Keywords

  • Dictionary learning
  • Inverse problems
  • Sparse representation
