Space-time is a profound concept in physics. This concept was shown to be useful for dimensionality reduction. We present basic definitions together with some counter-intuitive properties of space-time. We give theoretical propositions showing that space-time is a more powerful representation than Euclidean space. We apply this concept to manifold learning for preserving local information. Empirical results on non-metric datasets show that more information can be preserved in space-time.
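The key property behind these claims can be illustrated with a small sketch. In a space-time (pseudo-Euclidean) representation with signature (+, …, +, −), the squared interval between two points subtracts the time-like component, so the "dissimilarity" can be negative and need not satisfy the triangle inequality; this is why such embeddings can absorb non-metric structure that a Euclidean space cannot. The function below is a hypothetical illustration of this standard construction, not code from the paper; the name `spacetime_interval_sq` and the single-time-dimension default are assumptions.

```python
import numpy as np

def spacetime_interval_sq(x, y, n_time=1):
    """Squared pseudo-Euclidean interval with signature (+, ..., +, -, ..., -).

    The last `n_time` coordinates are time-like and contribute negatively.
    This is an illustrative sketch, not the paper's implementation.
    """
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    n_space = len(d) - n_time
    space_part = np.sum(d[:n_space] ** 2)   # ordinary Euclidean contribution
    time_part = np.sum(d[n_space:] ** 2)    # subtracted, time-like contribution
    return space_part - time_part

# Two points differing only in space: positive squared interval, as in Euclidean space.
print(spacetime_interval_sq([0.0, 0.0, 0.0], [1.0, 0.0, 0.0]))  # → 1.0

# Two points differing only in time: negative squared interval --
# impossible in any Euclidean embedding, which is the extra expressive power.
print(spacetime_interval_sq([0.0, 0.0, 0.0], [0.0, 0.0, 1.0]))  # → -1.0
```

Because the interval can be negative, pairs of points can be made arbitrarily "close" (or closer than coincident points) regardless of their spatial separation, which a Euclidean distance matrix can never express.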
Details
Title
Space-time local embeddings
Author(s)
Sun, Ke (Viper Group, Computer Vision and Multimedia Laboratory, University of Geneva)
Wang, Jun (Expedia Inc.)
Kalousis, Alexandros (Haute école de gestion de Genève, HES-SO Haute Ecole Spécialisée de Suisse Occidentale)
Marchand-Maillet, Stéphane (Viper Group, Computer Vision and Multimedia Laboratory, University of Geneva)
Date
2015-12
Published in
Advances in Neural Information Processing Systems 28 : Annual Conference on Neural Information Processing Systems 2015, December 7-12, 2015, Montreal, Quebec, Canada
Publisher
Montréal, Canada, 11th December 2015
Pagination
9 p.
Presented at
Time Series Workshop of the 29th Neural Information Processing Systems conference, NIPS-2015, Montréal, Canada, 11/12/2015
Paper type
full paper
Domain
Economics and Services
School
HEG - Genève
Institute
CRAG - Centre de Recherche Appliquée en Gestion