Publication: Métodos MCMC para la inferencia Bayesiana
Identifiers
Publication date
2013-09
Defense date
2013-09-16
Authors
Advisors
Tutors
Journal Title
Journal ISSN
Volume Title
Publisher
Abstract
In many applications, a generative model for the observations is assumed. When measurement noise is also assumed, this model yields a likelihood function. Given a set of received measurements or observations, the Probability Density Function (pdf) of these observations is then generally known. Bayesian inference consists of estimating parameters of the posterior pdf by minimizing a previously established cost.
The main drawback is that computing these estimators generally requires evaluating very complex integrals, typically in several dimensions. In most cases this cannot be done analytically. A stochastic-approximation alternative for overcoming this drawback is the family of MCMC (Markov Chain Monte Carlo) methods, based on Monte Carlo methods. The MCMC method studied in this project is the Metropolis-Hastings method.
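The integrals mentioned above can be approximated by Monte Carlo: an expectation under a pdf is replaced by an average over samples drawn from it. A minimal sketch, with an illustrative target (a standard Gaussian) and function (the second moment, whose true value is 1.0); none of these choices come from the thesis:

```python
import random

# Monte Carlo approximation of E[f(X)] = integral of f(x) p(x) dx:
# draw N samples from p and average f over them.
# Illustrative choice: p = standard Gaussian, f(x) = x**2,
# so the true value is the variance, 1.0.
random.seed(0)
N = 100_000
samples = [random.gauss(0.0, 1.0) for _ in range(N)]
estimate = sum(x * x for x in samples) / N
```

The error of such an estimator shrinks as the square root of N, which is why the sample size is large.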
The Metropolis-Hastings algorithm generates a sequence of samples from a Markov chain that converges to the target density. Its main limitation is that the generated chain can become trapped in a high-probability subregion of the target density, such as a mode, and fail to represent the desired pdf faithfully. To avoid this, there are advanced methods based on the Metropolis-Hastings algorithm, which mainly consist of running several Metropolis-Hastings algorithms
in parallel.
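The Metropolis-Hastings step described above can be sketched as follows. This is a minimal random-walk variant with a symmetric Gaussian proposal; the target density, step size, and chain length are illustrative assumptions, not values from the thesis:

```python
import math
import random

# Random-walk Metropolis-Hastings: propose a local move and accept it
# with probability min(1, target(y) / target(x)). The target only needs
# to be known up to a normalizing constant.
def metropolis_hastings(target, x0, n_iter=20_000, step=1.0, seed=0):
    rng = random.Random(seed)
    x, px = x0, target(x0)
    chain = []
    for _ in range(n_iter):
        y = x + rng.gauss(0.0, step)   # symmetric Gaussian proposal
        py = target(y)
        if rng.random() * px < py:     # accept w.p. min(1, py / px)
            x, px = y, py
        chain.append(x)                # on rejection, repeat current state
    return chain

# Illustrative target: unnormalized standard Gaussian density.
chain = metropolis_hastings(lambda x: math.exp(-0.5 * x * x), x0=0.0)
mean = sum(chain) / len(chain)
var = sum((s - mean) ** 2 for s in chain) / len(chain)
```

Because the proposal is symmetric, the Hastings correction term cancels and only the ratio of target densities appears in the acceptance probability.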
The advanced methods studied in this project are the Multiple Try Metropolis (MTM) method and Metropolis-Hastings Delayed Rejection (MHDR). The former generates several candidate samples per iteration and accepts the most suitable one, whereas the Metropolis-Hastings method generates only one candidate per iteration. The MHDR method is similar to MTM, but candidates are proposed one at a time, and a new candidate is proposed only if the previous one has been rejected. Both are valid algorithms that generate reversible chains, ensuring convergence to the target density.
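The MTM idea of drawing several candidates and selecting among them can be sketched as below. This follows the standard MTM construction with a symmetric proposal (candidate selected in proportion to its target weight, then a reference set drawn around it to keep the chain reversible); the target and all tuning constants are illustrative assumptions:

```python
import math
import random

# Multiple Try Metropolis with a symmetric Gaussian proposal.
def mtm(target, x0, n_iter=10_000, k=5, step=1.0, seed=0):
    rng = random.Random(seed)
    x = x0
    chain = []
    for _ in range(n_iter):
        # Draw k candidates around the current state; with a symmetric
        # proposal, the selection weights reduce to the target values.
        ys = [x + rng.gauss(0.0, step) for _ in range(k)]
        wy = [target(y) for y in ys]
        total_y = sum(wy)
        # Select one candidate with probability proportional to its weight.
        u, acc, j = rng.random() * total_y, 0.0, 0
        for j, w in enumerate(wy):
            acc += w
            if u <= acc:
                break
        y = ys[j]
        # Reference set: k-1 points drawn around y, plus the current state.
        xs = [y + rng.gauss(0.0, step) for _ in range(k - 1)] + [x]
        total_x = sum(target(z) for z in xs)
        # Generalized acceptance ratio preserves detailed balance.
        if rng.random() < min(1.0, total_y / total_x):
            x = y
        chain.append(x)
    return chain

# Illustrative target: unnormalized standard Gaussian density.
chain = mtm(lambda x: math.exp(-0.5 * x * x), x0=0.0)
mean = sum(chain) / len(chain)
```

Trying several candidates per iteration costs more target evaluations per step but lets the chain take larger, better-directed moves, which is the trade-off the thesis examines.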
In many applications, a generative model for the observations is assumed. When measurement noise is present, this model establishes a likelihood function. Given the received measurements or observations, the Probability Density Function (pdf) of these observations is generally known. Bayesian inference consists of estimating parameters of the pdf by minimizing a previously established cost. The main drawback of these estimators is the need to compute complicated integrals, usually in several dimensions. An alternative for overcoming this drawback is the family of MCMC (Markov Chain Monte Carlo) methods, based on Monte Carlo methods. The MCMC method used in this project is the Metropolis-Hastings method. The Metropolis-Hastings algorithm generates samples from a Markov chain that converges to the target density. Its main limitation is that the generated chain can become trapped in a subset of the density, such as a mode, and fail to estimate the pdf faithfully. To prevent this, there are advanced methods based on the Metropolis-Hastings algorithm, which mainly consist of running several Metropolis-Hastings algorithms in parallel. The advanced methods studied here are Multiple Try Metropolis (MTM) and Metropolis-Hastings Delayed Rejection (MHDR). MTM generates several candidate samples per iteration and chooses the best one, whereas the Metropolis-Hastings algorithm generates only one candidate sample per iteration. The MHDR method is similar to MTM, but candidates are proposed one at a time, and a new candidate is proposed only if the previous one has been rejected. Both are useful algorithms that generate reversible chains, ensuring convergence to the target density.
Description
Keywords
Inferencia bayesiana, Estadística bayesiana, Método de Monte Carlo, Procesos de Markov, Algoritmos, Funciones de densidad de probabilidad