Sufficient Dimensionality Reduction for Sequence Classification
The Sufficient Dimensionality Reduction (SDR) framework seeks a low-dimensional subspace of the covariates that retains all of the information relevant to the output labels; sufficiency is formalized as conditional independence of the labels and the covariates given the projection onto that subspace. Kernel dimensionality reduction (KDR), a particular instance of SDR, enforces this conditional independence by minimizing the trace of a conditional covariance operator defined on reproducing kernel Hilbert spaces. We extend this framework, originally designed for static data, to sequential data through kernel design and dynamic time warping. Both techniques capture periodicity in the data and encourage temporally close data points to remain close in the latent space. The result is a dimensionality reduction technique suited to capturing dynamics in data, enabling more efficient classifiers to be built on top of the latent feature space. Our approach is evaluated on a variety of computer vision datasets, ranging from human activities to dynamic textures. To learn more about the project, contact Alex Shyr.
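For a concrete picture of the two ingredients mentioned above, the sketch below combines a dynamic-time-warping-based kernel between sequences with the standard empirical KDR criterion Tr[K_y^c (K_z^c + n*eps*I)^{-1}] (smaller is better), where K_z^c is the centered Gram matrix of the projected covariates. This is a minimal NumPy illustration, not the project's implementation: the Gaussian-of-DTW kernel form, the `bandwidth` and `eps` parameters, and the function names are assumptions, and this simple DTW kernel is not guaranteed to be positive definite.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of the three admissible warping steps.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def dtw_kernel(seqs, bandwidth=1.0):
    """Gaussian-style similarity on pairwise DTW distances (illustrative choice)."""
    n = len(seqs)
    K = np.zeros((n, n))
    for i in range(n):
        for j in range(i, n):
            d = dtw_distance(seqs[i], seqs[j])
            K[i, j] = K[j, i] = np.exp(-d ** 2 / (2 * bandwidth ** 2))
    return K

def center(K):
    """Center a Gram matrix: H K H with H = I - (1/n) 11^T."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def kdr_objective(K_y, K_z, eps=1e-3):
    """Empirical KDR criterion Tr[K_y^c (K_z^c + n*eps*I)^{-1}]."""
    n = K_y.shape[0]
    Gy, Gz = center(K_y), center(K_z)
    return np.trace(Gy @ np.linalg.inv(Gz + n * eps * np.eye(n)))
```

In a full pipeline one would compute K_y from the class labels, compute K_z from the low-dimensional (or warped) sequence representations, and search over projections to minimize `kdr_objective`; the optimization over the projection is omitted here.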
