RT Journal Article
T1 Improved Method to Select the Lagrange Multiplier for Rate-Distortion Based Motion Estimation in Video Coding
A1 González de Suso Molinero, José Luis
A1 Jiménez Moreno, Amaya
A1 Martínez Enríquez, Eduardo
A1 Díaz de María, Fernando
AB The motion estimation (ME) process used in the H.264/AVC reference software is based on minimizing a cost function that involves two terms (distortion and rate), which are balanced through a Lagrangian parameter usually denoted as lambda(motion). In this paper we propose an algorithm that improves the conventional way of estimating lambda(motion) and, consequently, the ME process. First, we show that the conventional estimation of lambda(motion) turns out to be significantly less accurate when ME-compromising events, which make the ME process perform poorly, occur. Second, with the aim of improving the coding efficiency in these cases, an efficient algorithm is proposed that allows the encoder to choose among three different values of lambda(motion) for the Inter 16x16 partition size. To be more precise, for this partition size, the proposed algorithm allows the encoder to additionally test lambda(motion) = 0 and lambda(motion) arbitrarily large, which correspond to the minimum-distortion and minimum-rate solutions, respectively. By testing these two extreme values, the algorithm avoids making large ME errors. The experimental results on video segments exhibiting this type of ME-compromising event reveal an average rate reduction of 2.20% for the same coding quality with respect to the JM15.1 reference software of H.264/AVC. The algorithm has also been tested against a state-of-the-art algorithm called the context adaptive Lagrange multiplier. Additionally, two illustrative examples of the subjective performance improvement are provided.
PB IEEE
SN 1051-8215
YR 2014
FD 2014-03
LK https://hdl.handle.net/10016/21469
UL https://hdl.handle.net/10016/21469
LA eng
NO This work has been partially supported by the National Grant TEC2011-26807 of the Spanish Ministry of Science and Innovation.
DS e-Archivo
RD 19 May 2024