Title: On the information dimension rate of stochastic processes
Authors: Geiger, Bernhard C.; Koch, Tobias Mirco
Date issued: 2017-08-15 (record created/available: 2018-02-13)
Published in: 2017 IEEE International Symposium on Information Theory [Proceedings], pp. 888-892
Conference: 2017 IEEE International Symposium on Information Theory, Aachen, Germany, 25-30 June 2017
ISBN: 978-1-5090-4096-4
DOI: https://doi.org/10.1109/ISIT.2017.8006656
Handle: https://hdl.handle.net/10016/26234
Type: conference paper
Pages: 5
Format: application/pdf
Language: English
Rights: © 2017 IEEE; open access
Keywords: Entropy; Rate-distortion; Gaussian processes; Compressed sensing; Encoding
Subjects: Electronics; Telecommunications

Abstract: Jalali and Poor ("Universal compressed sensing," arXiv:1406.7807v3, Jan. 2016) have recently proposed a generalization of Rényi's information dimension to stationary stochastic processes, defining the information dimension of a stochastic process as the information dimension of k samples divided by k in the limit as k → ∞. This paper proposes an alternative definition: the entropy rate of the uniformly quantized stochastic process divided by minus the logarithm of the quantizer step size 1/m in the limit as m → ∞. Both definitions are shown to be equivalent for stochastic processes that are ψ*-mixing, but they may differ in general. In particular, for Gaussian processes with essentially bounded power spectral density (PSD), the proposed information dimension equals the Lebesgue measure of the PSD's support. This is in stark contrast to the information dimension proposed by Jalali and Poor, which is 1 whenever the process's PSD is positive on a set of positive Lebesgue measure, irrespective of the size of its support.
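The abstract states the two competing definitions only in words. Below is a minimal LaTeX sketch of both, written in standard notation that is assumed rather than taken from the record itself: d(·) for Rényi's information dimension of a random vector, [X_t]_m for the uniform quantization of X_t with step size 1/m, H'(·) for the entropy rate of a process, and λ for Lebesgue measure.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

% Jalali--Poor definition: per-sample Renyi information dimension
% of a block of k samples, in the limit of large k.
\[
  d_{\mathrm{JP}}(\{X_t\}) \;=\; \lim_{k\to\infty} \frac{d(X_1,\ldots,X_k)}{k}
\]

% Definition proposed in this paper: entropy rate of the uniformly
% quantized process, normalized by minus the log of the step size 1/m.
\[
  d'(\{X_t\}) \;=\; \lim_{m\to\infty}
    \frac{H'\bigl(\{[X_t]_m\}\bigr)}{-\log(1/m)}
  \;=\; \lim_{m\to\infty}
    \frac{H'\bigl(\{[X_t]_m\}\bigr)}{\log m}
\]

% Result stated in the abstract for a Gaussian process with essentially
% bounded PSD S_X: the proposed dimension equals the Lebesgue measure of
% the PSD's support (assuming the frequency axis is normalized so that
% the full band has measure 1; that normalization is an assumption here).
\[
  d'(\{X_t\}) \;=\; \lambda\bigl(\operatorname{supp} S_X\bigr)
\]

\end{document}

Under the abstract's ψ*-mixing condition the two limits above coincide, so the sketched d_JP and d' agree for such processes; the Gaussian PSD result is what separates them in general.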