Alleviating sequence information loss with data overlapping and prime batch sizes

Kocher, Noémien (School of Engineering and Architecture (HEIA-FR), HES-SO // University of Applied Sciences Western Switzerland) ; Scuito, Christian (Ecole Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland) ; Tarantino, Lorenzo (Ecole Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland) ; Lazaridis, Alexandros (Swisscom, Switzerland) ; Fischer, Andreas (School of Engineering and Architecture (HEIA-FR), HES-SO // University of Applied Sciences Western Switzerland ; University of Fribourg, Fribourg, Switzerland) ; Musat, Claudiu (Swisscom, Switzerland)

In sequence modeling tasks the token order matters, but this information can be partially lost due to the discretization of the sequence into data points. In this paper, we study the imbalance between the way certain token pairs are included in data points and others are not. We call this a token order imbalance (TOI) and we link the partial sequence information loss to a diminished performance of the system as a whole, both in text and speech processing tasks. We then provide a mechanism to leverage the full token order information—Alleviated TOI—by iteratively overlapping the token composition of data points. For recurrent networks, we use prime numbers for the batch size to avoid redundancies when building batches from overlapped data points. The proposed method achieved state-of-the-art performance in both text and speech related tasks.
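The two ideas in the abstract can be sketched in a few lines. This is not the authors' implementation, only a minimal illustration under assumed names (`chunk`, `overlapped_passes`, `batches` are hypothetical helpers): sliding the chunking offset makes token pairs that fall on a chunk boundary in one pass land inside a single data point in another pass, and a prime batch size keeps successive overlapped passes from regrouping the same data points into the same batches.

```python
def chunk(tokens, seq_len, offset):
    """Split `tokens` into fixed-length data points, starting at `offset`."""
    usable = tokens[offset:]
    n = len(usable) // seq_len
    return [usable[i * seq_len:(i + 1) * seq_len] for i in range(n)]

def overlapped_passes(tokens, seq_len):
    """One pass per offset 0..seq_len-1; together the passes cover every
    consecutive token pair inside at least one data point."""
    return [chunk(tokens, seq_len, off) for off in range(seq_len)]

def batches(data_points, batch_size):
    """Group data points into batches (a trailing partial batch is dropped)."""
    n = len(data_points) // batch_size
    return [data_points[i * batch_size:(i + 1) * batch_size] for i in range(n)]

tokens = list(range(20))
passes = overlapped_passes(tokens, seq_len=4)
# With offset 0 the pair (3, 4) is split across two data points;
# with offset 1 it falls inside one data point: [1, 2, 3, 4].
prime_batches = batches(passes[0], batch_size=3)  # 3 is prime
```

In this toy setting, offset 0 yields chunks `[0..3], [4..7], ...`, while offset 1 yields `[1..4], [5..8], ...`, so boundary pairs such as (3, 4) are recovered.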


Conference Type:
full paper
Faculty:
Ingénierie et Architecture
School:
HEIA-FR
Institute:
iCoSys - Institut des systèmes complexes
Date:
2019-11
Conference location and date:
Hong Kong, China, 3-4 November 2019
Pagination:
10 p.
Published in:
Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL), 3-4 November 2019, Hong Kong, China
Numeration (vol. no.):
pp. 890-899



 Record created 2020-01-07, last modified 2020-04-28

Fulltext:
PDF
