2013 ISIT Plenary Lecture
High Dimensional Classification with Invariant Deep Networks
École Normale Supérieure
Intra-class variability is the curse of most high-dimensional classification problems. Fighting it means finding discriminative invariants. Classical mathematical invariants are either unstable to signal variabilities or lose too much information. Surprisingly, non-linear deep neural networks have become "hot" again by accumulating experimental successes over a wide range of applications for speech, images and biological data. We show that such architectures build hierarchical invariants over cascades of Lie groups, which reduce signal variabilities while preserving discrimination. Invariants are computed with filters corresponding to wavelets defined on each group. They are learned from unsupervised data with sparse representation strategies that remain to be fully understood. Applications to images and sounds will be discussed and demonstrated.
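To make the cascade concrete, the following is a minimal numerical sketch of the idea for the translation group in 1D: a wavelet filter bank, a complex modulus, and a low-pass average are cascaded, so that translation variability is reduced while the wavelet envelopes preserve discriminative information. The Gaussian band-pass filters, scale counts, and all parameter values here are illustrative assumptions, not the lecture's actual construction.

```python
import numpy as np

def bandpass_filters(n, num_scales, xi0=3.0):
    """Bank of analytic band-pass filters in the Fourier domain.
    These Gaussian bumps are a simplified stand-in for group wavelets."""
    omega = np.fft.fftfreq(n) * 2 * np.pi
    filters = []
    for j in range(num_scales):
        xi = xi0 / 2 ** j          # centre frequency halves at each scale
        sigma = xi / 2.0           # bandwidth proportional to frequency
        filters.append(np.exp(-((omega - xi) ** 2) / (2 * sigma ** 2)))
    return filters

def low_pass(n, sigma=0.05):
    """Low-pass averaging filter: the source of translation invariance."""
    omega = np.fft.fftfreq(n) * 2 * np.pi
    return np.exp(-(omega ** 2) / (2 * sigma ** 2))

def scattering(x, num_scales=4):
    """Two-layer cascade: take |x * psi_j|, average with the low pass,
    then repeat the wavelet-modulus step on each first-order envelope."""
    n = len(x)
    psis = bandpass_filters(n, num_scales)
    phi = low_pass(n)
    x_hat = np.fft.fft(x)
    # Order 0: local average of the signal itself.
    coeffs = [np.real(np.fft.ifft(x_hat * phi))]
    # Order 1: averaged wavelet modulus (envelope) at each scale.
    envelopes = []
    for psi in psis:
        u = np.abs(np.fft.ifft(x_hat * psi))
        envelopes.append(u)
        coeffs.append(np.real(np.fft.ifft(np.fft.fft(u) * phi)))
    # Order 2: cascade again on each envelope, at coarser scales only.
    for j1, u in enumerate(envelopes):
        u_hat = np.fft.fft(u)
        for j2 in range(j1 + 1, num_scales):
            v = np.abs(np.fft.ifft(u_hat * psis[j2]))
            coeffs.append(np.real(np.fft.ifft(np.fft.fft(v) * phi)))
    return np.stack(coeffs)

# A small translation changes the raw signal far more than its
# cascaded invariant coefficients.
n = 512
t = np.arange(n)
x = np.cos(2 * np.pi * 8 * t / n) * np.exp(-((t - 200) / 60.0) ** 2)
x_shift = np.roll(x, 5)
raw_dist = np.linalg.norm(x - x_shift) / np.linalg.norm(x)
s_dist = (np.linalg.norm(scattering(x) - scattering(x_shift))
          / np.linalg.norm(scattering(x)))
print(raw_dist, s_dist)
```

The printed relative distances show the contraction of variability the abstract describes: averaging the wavelet moduli shrinks the effect of the translation, while the multi-scale envelopes keep enough information to separate different signals.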