Space-time is a profound concept in physics, and it has been shown to be useful for dimensionality reduction. We present basic definitions along with some counter-intuitive properties of space-time. We give theoretical propositions showing that space-time is a more powerful representation than Euclidean space, and we apply this concept to manifold learning so as to preserve local information. Empirical results on non-metric datasets show that more information can be preserved in a space-time embedding.
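The key point of the abstract can be made concrete with a minimal sketch (an illustration of the general idea, not the paper's actual embedding algorithm): in a space-time representation, some coordinates are time-like and enter the squared interval with a negative sign, so the squared "distance" between two points can be negative. No Euclidean embedding can produce negative squared distances, which is why space-time can represent non-metric dissimilarity data. The function name and coordinate layout below are illustrative assumptions.

```python
# Illustrative sketch: squared space-time interval with an indefinite
# signature (space-like coordinates count positively, time-like ones
# negatively). This is NOT the paper's method, only the underlying idea.

def sq_interval(p, q, n_space):
    """Squared space-time interval between points p and q.

    The first n_space coordinates are space-like (contribute +),
    the remaining coordinates are time-like (contribute -).
    """
    space = sum((a - b) ** 2 for a, b in zip(p[:n_space], q[:n_space]))
    time = sum((a - b) ** 2 for a, b in zip(p[n_space:], q[n_space:]))
    return space - time

# Two points at the same spatial location but different "times":
p = (0.0, 0.0, 0.0)   # (x, y, t)
q = (0.0, 0.0, 1.0)
print(sq_interval(p, q, n_space=2))  # -1.0: a negative squared distance,
                                     # impossible in any Euclidean space
```

With all time coordinates equal, `sq_interval` reduces to the ordinary Euclidean squared distance, so the space-time representation strictly generalizes the Euclidean one.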
Details
Title
Space-time local embeddings
Author(s)
Sun, Ke (Viper Group, Computer Vision and Multimedia Laboratory, University of Geneva)
Wang, Jun (Expedia Inc.)
Kalousis, Alexandros (Haute école de gestion de Genève, HES-SO Haute Ecole Spécialisée de Suisse Occidentale)
Marchand-Maillet, Stéphane (Viper Group, Computer Vision and Multimedia Laboratory, University of Geneva)
Date
2015-12
Published in
Advances in Neural Information Processing Systems 28: Annual Conference on Neural Information Processing Systems 2015, December 7-12, 2015, Montreal, Quebec, Canada
Pagination
9 p.
Presented at
Time Series Workshop of the 29th Neural Information Processing Systems conference, NIPS-2015, Montréal, Canada, 11/12/2015
Paper type
full paper
Faculty
Economie et Services
School
HEG - Genève
Institute
CRAG - Centre de Recherche Appliquée en Gestion