Abstract

Humans are able to quickly adapt to new situations, learn effectively with limited data, and create unique combinations of basic concepts. In contrast, generalizing to out-of-distribution (OOD) data and achieving combinatorial generalization remain fundamental challenges for machine learning models. To address these challenges, we propose BtVAE, a method that employs supervised conditional VAE models to achieve combinatorial generalization in certain scenarios and consequently to generate OOD data. Unlike previous approaches that use new factors of variation during testing, our method uses only attributes already present in the training data, but in combinations that were not seen during training (e.g., small objects during training and large objects during testing). We first learn a latent representation of the in-distribution inputs and pass this representation to a conditional decoder, conditioned on OOD attribute values, to generate implicit OOD samples. These generated samples are then translated back to the original in-distribution inputs, conditioned on the actual attribute values. To ensure that the generated OOD samples exhibit the specified OOD attribute values, a predictor is introduced. By training with OOD attribute values, the decoder learns to produce correct outputs for unseen combinations, resulting in a model that can not only reconstruct OOD data but also manipulate OOD data and generate samples conditioned on unseen combinations of attribute values.

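To make the back-translation idea concrete, the following is a minimal sketch of one training step under assumptions not stated in the abstract: the encoder returns Gaussian parameters (mu, logvar), the decoder and predictor take and return tensors directly, attributes are continuous so an MSE predictor loss is used, and a standard VAE KL term regularizes the latent. The function and argument names (btvae_step, y_real, y_ood) are hypothetical; the paper's exact architecture and losses may differ.

import torch
import torch.nn.functional as F

def btvae_step(encoder, decoder, predictor, x, y_real, y_ood):
    # x      : in-distribution inputs
    # y_real : their actual attribute values (seen combinations)
    # y_ood  : attribute values forming combinations unseen in training

    # 1. Encode the in-distribution input into a latent representation.
    mu, logvar = encoder(x)
    z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)

    # 2. Decode conditioned on OOD attribute values -> implicit OOD sample.
    x_ood = decoder(z, y_ood)

    # 3. Predictor encourages the generated sample to carry the OOD attributes.
    pred_loss = F.mse_loss(predictor(x_ood), y_ood)

    # 4. Back-translate: re-encode the OOD sample and decode with the actual
    #    attribute values; the result should reconstruct the original input.
    mu2, logvar2 = encoder(x_ood)
    z2 = mu2 + torch.randn_like(mu2) * torch.exp(0.5 * logvar2)
    x_rec = decoder(z2, y_real)
    rec_loss = F.mse_loss(x_rec, x)

    # Standard VAE KL regularizer on the first encoding.
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())

    return rec_loss + pred_loss + kl

In this sketch the decoder never sees a ground-truth image paired with an unseen attribute combination; it is supervised only indirectly, through the predictor on the implicit OOD sample and the reconstruction of the original input after back-translation.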