Authors: Geiger, Bernhard C.; Koch, Tobias Mirco
Date deposited: 2019-09-18
Date available: 2019-09-18
Issue date: 2019-10
Citation: IEEE Transactions on Information Theory, 65(10), pp. 6496-6518
ISSN: 0018-9448
e-ISSN: 1557-9654 (online)
Handle: http://hdl.handle.net/10016/28825
Abstract: In 1959, Rényi proposed the information dimension and the d-dimensional entropy to measure the information content of general random variables. This paper proposes a generalization of information dimension to stochastic processes by defining the information dimension rate as the entropy rate of the uniformly quantized stochastic process divided by minus the logarithm of the quantizer step size 1/m, in the limit as m → ∞. It is demonstrated that the information dimension rate coincides with the rate-distortion dimension, defined as twice the rate-distortion function R(D) of the stochastic process divided by -log(D), in the limit as D ↓ 0. It is further shown that, among all multivariate stationary processes with a given (matrix-valued) spectral distribution function (SDF), the Gaussian process has the largest information dimension rate, and that the information dimension rate of multivariate stationary Gaussian processes is given by the average rank of the derivative of the SDF. The presented results reveal that the fundamental limits of almost zero-distortion recovery via compressible signal pursuit and almost lossless analog compression are different in general.
Pages: 23
Language: English
Rights: © 2019 IEEE.
Keywords: Entropy; Gaussian process; Information dimension; Rate-distortion dimension
Title: On the Information Dimension of Stochastic Processes
Type: research article
Subjects: Electronics; Telecommunications
DOI: https://doi.org/10.1109/TIT.2019.2922186
Access: open access
First page: 6496
Issue: 10
Last page: 6518
Journal: IEEE Transactions on Information Theory
Volume: 65
Identifier: AR/0000023946
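As a reading aid, the two limits described verbally in the abstract can be sketched in standard notation; the symbols below (the entropy-rate operator H-bar, the quantized process [X]_m, and the names d and dim_R) are illustrative choices and are not taken from the record itself.

% Sketch of the two quantities named in the abstract (notation assumed, not from the record).
% Information dimension rate: entropy rate of the process quantized with step size 1/m,
% divided by -log(1/m) = log m, as m grows without bound.
\[
  d\bigl(\{X_t\}\bigr) \;=\; \lim_{m \to \infty} \frac{\bar{H}\bigl([X]_m\bigr)}{\log m},
  \qquad
  \dim_R\bigl(\{X_t\}\bigr) \;=\; \lim_{D \downarrow 0} \frac{2\,R(D)}{-\log D},
\]
% where \bar{H}(\cdot) denotes the entropy rate of the uniformly quantized process and
% R(D) is the rate-distortion function of the process. The paper's main identification,
% as stated in the abstract, is that these two limits coincide.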