DTSC - GTSA - Conference papers and other events
http://hdl.handle.net/10016/26209

Saddlepoint Approximation of the Error Probability of Binary Hypothesis Testing
http://hdl.handle.net/10016/27525
2018-08-16
Vázquez Vilar, Gonzalo; Guillén i Fàbregas, Albert; Koch, Tobias Mirco; Lancho Serrano, Alejandro
We propose a saddlepoint approximation of the error probability of a binary hypothesis test between two i.i.d. distributions. The approximation is accurate, simple to compute, and yields a unified analysis in different asymptotic regimes. The proposed formulation is used to efficiently compute the meta-converse lower bound for moderate block-lengths in several cases of interest.
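The exponential part of such an approximation is governed by the cumulant generating function of the log-likelihood ratio evaluated at its saddlepoint. As a minimal sketch, assuming a test between two i.i.d. Bernoulli distributions (an illustrative choice, not an example taken from the paper), the optimal Chernoff exponent can be computed as follows; the saddlepoint approximation studied in the paper additionally supplies a sub-exponential prefactor, which is omitted here:

```python
import math

def chernoff_exponent(p, q, grid=2000):
    """Best exponent min_{0<s<1} kappa(s) for testing i.i.d. Bernoulli(p)
    against Bernoulli(q), where kappa(s) = log E_P[(Q/P)^s] is the cumulant
    generating function of the log-likelihood ratio.

    The minimizer s* is the saddlepoint; the error probability of the optimal
    test decays roughly like exp(n * kappa(s*)). Only the exponent is computed
    here (by grid search), not the sub-exponential prefactor.
    """
    def kappa(s):
        return math.log(p**(1 - s) * q**s + (1 - p)**(1 - s) * (1 - q)**s)
    return min(kappa(i / grid) for i in range(1, grid))

exponent = chernoff_exponent(0.2, 0.6)  # strictly negative when p != q
```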

Design of Discrete Constellations for Peak-Power-Limited Complex Gaussian Channels
http://hdl.handle.net/10016/27521
2018-08-16
Huleihel, Wasim; Goldfeld, Ziv; Koch, Tobias Mirco; Madiman, Mokshay; Médard, Muriel
The capacity-achieving input distribution of the complex Gaussian channel with both average- and peak-power constraints is known to have a discrete amplitude and a continuous, uniformly distributed phase. Practical considerations, however, render the continuous phase inapplicable. This work studies the backoff from capacity induced by discretizing the phase of the input signal. A sufficient condition on the total number of quantization points that guarantees an arbitrarily small backoff is derived, and constellations that attain this guaranteed performance are proposed.
Proceeding of: IEEE International Symposium on Information Theory (ISIT 2018)
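Discretizing the phase of a discrete-amplitude input yields an APSK-like constellation: each amplitude ring carries a finite number of uniformly spaced phase points in place of the continuous uniform phase. A minimal sketch, with illustrative amplitude levels and phase counts rather than the optimized values from the paper:

```python
import cmath

def apsk_constellation(amplitudes, phases_per_ring):
    """Build a discrete constellation from amplitude rings and quantized phases.

    `amplitudes` and `phases_per_ring` are illustrative inputs, not the
    paper's optimized design: ring r carries phases_per_ring[r] uniformly
    spaced phase points at radius amplitudes[r].
    """
    points = []
    for a, k in zip(amplitudes, phases_per_ring):
        points.extend(a * cmath.exp(2j * cmath.pi * n / k) for n in range(k))
    return points

# Two rings, 4 + 8 points; peak power |x|^2 <= 1 by construction.
const = apsk_constellation([0.5, 1.0], [4, 8])
```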

Normal approximations for fading channels
http://hdl.handle.net/10016/27514
2018-05-24
Lancho Serrano, Alejandro; Koch, Tobias Mirco; Durisi, Giuseppe
Capacity and outage capacity characterize the maximum coding rate at which reliable communication is feasible when there are no constraints on the packet length. Evaluated for fading channels, they are important performance benchmarks for wireless communication systems. However, the latency of a communication system is proportional to the length of the packets it exchanges, so assuming that there are no constraints on the packet length may be overly optimistic for communication systems with stringent latency constraints. Recently, there has been great interest within the information theory community in characterizing the maximum coding rate for short packet lengths. Research on this topic is often concerned with asymptotic expansions of the coding rate with respect to the packet length, which then give rise to normal approximations. In this paper, we review existing normal approximations for single-antenna Rayleigh block-fading channels and compare them with the high-SNR normal approximation we presented at the 2017 IEEE International Symposium on Information Theory (Lancho, Koch, and Durisi, 2017). We further discuss how these normal approximations may help to assess the performance of communication protocols.
Proceeding of: 52nd Annual Conference on Information Sciences and Systems (CISS 2018)
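Normal approximations generally take the form R ≈ C − sqrt(V/n)·Q⁻¹(ε), where C is the capacity, V the channel dispersion, n the blocklength, and ε the target error probability. As a hedged illustration, the sketch below evaluates this expression for the AWGN channel, whose C and V are standard, rather than for the Rayleigh block-fading channels treated in the paper:

```python
import math
from statistics import NormalDist

def awgn_normal_approximation(snr, n, eps):
    """Normal approximation R ~ C - sqrt(V/n) * Q^{-1}(eps), in nats per
    channel use, for the real AWGN channel at the given SNR.

    C and V below are the standard AWGN capacity and dispersion; the paper's
    approximations instead target Rayleigh block-fading channels, but share
    this overall structure.
    """
    C = 0.5 * math.log(1 + snr)
    V = snr * (snr + 2) / (2 * (snr + 1) ** 2)
    q_inv = NormalDist().inv_cdf(1 - eps)  # Q^{-1}(eps)
    return C - math.sqrt(V / n) * q_inv

rate = awgn_normal_approximation(snr=1.0, n=500, eps=1e-3)
```

For ε < 1/2 the correction term is positive, so the approximate rate lies below capacity and approaches it as the blocklength n grows.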

On the information dimension rate of stochastic processes
http://hdl.handle.net/10016/26234
2017-08-15
Geiger, Bernhard; Koch, Tobias Mirco
Jalali and Poor ("Universal compressed sensing," arXiv:1406.7807v3, Jan. 2016) have recently proposed a generalization of Rényi's information dimension to stationary stochastic processes by defining the information dimension of the stochastic process as the information dimension of k samples divided by k in the limit as k → ∞. This paper proposes an alternative definition of information dimension as the entropy rate of the uniformly-quantized stochastic process divided by minus the logarithm of the quantizer step size 1/m in the limit as m → ∞. It is demonstrated that both definitions are equivalent for stochastic processes that are ψ*-mixing, but that they may differ in general. In particular, it is shown that for Gaussian processes with essentially-bounded power spectral density (PSD), the proposed information dimension equals the Lebesgue measure of the PSD's support. This is in stark contrast to the information dimension proposed by Jalali and Poor, which is 1 if the process's PSD is positive on a set of positive Lebesgue measure, irrespective of its support size.
Proceeding of: 2017 IEEE International Symposium on Information Theory, Aachen, Germany, 25-30 June 2017
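For intuition on the quantization-based definition, consider the scalar mixture X = B·U with B ~ Bernoulli(ρ) and U uniform on (0, 1) — an illustrative example, not one taken from the paper. The entropy of the uniformly quantized variable divided by log m converges to ρ, the Rényi information dimension of this discrete-continuous mixture, and the ratio can be evaluated in closed form:

```python
import math

def quantized_entropy_ratio(rho, m):
    """H([X]_m) / log m for X = B*U, B ~ Bernoulli(rho), U ~ Uniform(0, 1).

    The quantizer [X]_m = floor(m*X)/m has step 1/m: bin 0 collects the atom
    at zero plus the continuous mass below 1/m, i.e. (1-rho) + rho/m, while
    each of the remaining m-1 bins gets mass rho/m. The ratio tends (slowly)
    to rho as m grows, matching the information dimension of the mixture.
    """
    p0 = (1 - rho) + rho / m
    pb = rho / m
    h = -p0 * math.log(p0) - (m - 1) * pb * math.log(pb)
    return h / math.log(m)

# Ratios decrease monotonically toward rho = 0.5 as the quantizer is refined.
ratios = [quantized_entropy_ratio(0.5, m) for m in (10, 100, 10000)]
```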