A Comparison of MCMC Sampling for Probabilistic Logic Programming
Tags: Approximate Inference, Markov Chain Monte Carlo, Probabilistic Logic Programming
Abstract:
Markov Chain Monte Carlo (MCMC) methods are a class of algorithms used to perform approximate inference in probabilistic models. When direct sampling from a probability distribution is difficult, MCMC algorithms construct a Markov chain whose stationary distribution is the desired distribution, so that the samples it produces yield increasingly accurate approximations. In this paper we describe two MCMC sampling algorithms, Gibbs sampling and Metropolis-Hastings sampling, and compare their performance with that of rejection sampling for probabilistic logic programs. In particular, we analyse the relationship between execution time and the number of samples, and how quickly each algorithm converges.
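
To make the mechanism described in the abstract concrete, the following is a minimal, self-contained sketch of a random-walk Metropolis-Hastings sampler in Python, one of the two MCMC algorithms compared in the paper. It is illustrative only: the target density, the Gaussian proposal, and all function names are assumptions chosen for this example, not the probabilistic-logic-programming implementation evaluated in the paper.

```python
# Minimal sketch of random-walk Metropolis-Hastings sampling.
# Illustrative only: target, proposal, and names are assumptions,
# not the paper's implementation for probabilistic logic programs.
import math
import random

def metropolis_hastings(log_target, x0, n_samples, proposal_std=1.0, seed=0):
    """Sample from an unnormalised density via a random-walk MH chain.

    log_target:   function returning the log of the unnormalised target density.
    x0:           initial state of the chain.
    n_samples:    number of chain states to record.
    proposal_std: standard deviation of the symmetric Gaussian proposal.
    """
    rng = random.Random(seed)
    x = x0
    log_p_x = log_target(x)
    samples = []
    for _ in range(n_samples):
        # Propose a new state from a symmetric Gaussian random walk.
        x_new = x + rng.gauss(0.0, proposal_std)
        log_p_new = log_target(x_new)
        # Accept with probability min(1, p(x_new) / p(x)); the proposal is
        # symmetric, so its densities cancel in the acceptance ratio.
        accept_prob = math.exp(min(0.0, log_p_new - log_p_x))
        if rng.random() < accept_prob:
            x, log_p_x = x_new, log_p_new
        samples.append(x)
    return samples

if __name__ == "__main__":
    # Example target: a standard normal density, known only up to a constant.
    log_std_normal = lambda x: -0.5 * x * x
    chain = metropolis_hastings(log_std_normal, x0=5.0, n_samples=20000)
    # Discard an initial burn-in period before estimating expectations.
    kept = chain[2000:]
    mean = sum(kept) / len(kept)
    print(f"estimated mean: {mean:.3f} (true mean: 0.0)")
```

The example also hints at the trade-off studied in the paper: estimates computed from the chain become more accurate as the number of samples grows, at the cost of longer execution time.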