RT Conference Proceedings
T1 A saliency-based attention LSTM model for cognitive load classification from speech
A1 Gallardo Antolín, Ascensión
A1 Montero Martínez, Juan Manuel
AB Cognitive Load (CL) refers to the amount of mental demand that a given task imposes on an individual's cognitive system, and it can affect their productivity in very high load situations. In this paper, we propose an automatic system capable of classifying the CL level of a speaker by analyzing their voice. Our research on this topic proceeds in two main directions. In the first, we focus on the use of Long Short-Term Memory (LSTM) networks with different weighted-pooling strategies for CL level classification. In the second contribution, to overcome the need for a large amount of training data, we propose a novel attention mechanism based on Kalinli's auditory saliency model. Experiments show that our proposal significantly outperforms both a baseline system based on Support Vector Machines (SVM) and an LSTM-based system with a logistic-regression attention model.
PB International Speech Communication Association (ISCA)
YR 2019
FD 2019
LK https://hdl.handle.net/10016/31660
UL https://hdl.handle.net/10016/31660
LA eng
NO Proceedings of: Interspeech 2019, 20th Annual Conference of the International Speech Communication Association, Graz, Austria, 15-19 September 2019
NO The work leading to these results has been partly supported by Spanish Government grants TEC2017-84395-P and TEC2017-84593-C2-1-R.
DS e-Archivo
RD 1 Jul. 2024