Title: Gibbs sampling will fail in outlier problems with strong masking
Authors: Justel, Ana; Peña, Daniel
Affiliation: Universidad Carlos III de Madrid. Departamento de Estadística
Date issued: 1995-06
Date accessioned/available: 2009-05-13
URI: https://hdl.handle.net/10016/4203
Type: Working paper
Subject: Estadística
Keywords: Bayesian analysis; Leverage; Linear regression; Scale contamination
Format: application/pdf
Language: eng
Rights: Atribución-NoComercial-SinDerivadas 3.0 España (Attribution-NonCommercial-NoDerivs 3.0 Spain)
Access: Open access

Abstract: This paper discusses the convergence of the Gibbs sampling algorithm when it is applied to the problem of outlier detection in regression models. Given any vector of initial conditions, the algorithm converges in theory to the true posterior distribution; in practice, however, convergence may be very slow in a high-dimensional parameter space where the parameters are highly correlated. We show that the effect of leverage in regression models makes convergence of the Gibbs sampling algorithm very difficult on data sets with strong masking. The problem is illustrated with several examples.
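The scale-contamination setting the abstract refers to can be sketched with a toy Gibbs sampler for the variance-inflation outlier model: each observation is clean with residual variance σ² or an outlier with inflated variance kσ². The sampler alternates between the regression coefficients, the error variance, and the latent outlier indicators. The data set, priors (prior outlier probability α = 0.05, inflation factor k = 10), and the two planted high-leverage outliers below are illustrative assumptions, not the paper's actual examples.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- simulate regression data with two coincident high-leverage outliers ---
# (illustrative data, not from the paper; two identical leverage points
# pulling the fit toward themselves is a simple masking configuration)
n = 30
x = rng.normal(0.0, 1.0, n)
x[-2:] = 8.0                      # two identical high-leverage points
X = np.column_stack([np.ones(n), x])
y = X @ np.array([1.0, 2.0]) + rng.normal(0.0, 0.5, n)
y[-2:] -= 10.0                    # shift both leverage points off the line

alpha, k = 0.05, 10.0             # assumed prior outlier prob. and inflation
n_iter, burn = 2000, 500

v = np.ones(n)                    # latent variance multipliers (1 or k)
sigma2 = 1.0
outlier_draws = np.zeros(n)

for it in range(n_iter):
    # beta | v, sigma2: weighted least-squares posterior (flat prior on beta)
    W = 1.0 / v
    cov = np.linalg.inv((X.T * W) @ X)
    mean = cov @ ((X.T * W) @ y)
    beta = rng.multivariate_normal(mean, sigma2 * cov)

    # sigma2 | beta, v: inverse gamma under the improper 1/sigma2 prior
    r = y - X @ beta
    sigma2 = (r ** 2 / v).sum() / (2.0 * rng.gamma(n / 2.0))

    # v_i | beta, sigma2: two-point conditional, computed on the log scale
    # to avoid overflow for large standardized residuals
    log_odds = (np.log(alpha / (1 - alpha)) - 0.5 * np.log(k)
                + r ** 2 * (1 - 1 / k) / (2 * sigma2))
    p_out = 1.0 / (1.0 + np.exp(-log_odds))
    v = np.where(rng.uniform(size=n) < p_out, k, 1.0)

    if it >= burn:
        outlier_draws += (v == k)

# posterior probability that each point is an outlier
post_prob = outlier_draws / (n_iter - burn)
print(np.round(post_prob[-2:], 2))
```

Because the two leverage points are identical, the conditional fit for beta tends to pass near them whenever their indicators are sampled as clean, which keeps their residuals small and the chain trapped in the masked mode for long stretches; this is the slow-mixing phenomenon the paper analyzes.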