For example, in the binomial and Poisson models, we have seen several examples of Bayesian inference where the posterior distribution concerns a single parameter. We have also worked through group-comparison examples. In addition, we have discussed various methods for obtaining or approximating the posterior and have worked through a few examples in which we simulated draws from the posterior using techniques such as grid approximation and MCMC. In this lecture we discuss the simulation approach more conceptually: why Markov chain Monte Carlo is needed and how to apply it to problems with multiple parameters. We will cover four MCMC techniques that are frequently used in the Bayesian literature.
The acceptance rate depends largely on the proposal distribution. If the proposal distribution has small variance, the proposed point will be close to the current point, so the ratio of target densities will be close to 1 and the proposal will very likely be accepted. Informally, two nearby points have similar densities because the target densities we typically work with are locally Lipschitz (a form of smoothness) at small scales.
If your current sample is close to the MAP value, proposals will have an acceptance probability less than one, though it may still be very close to 1.
As a side note, standard practice is to tune the proposal distribution to achieve an acceptance rate of roughly 0.2–0.25. See here for a discussion of this.
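The dependence of the acceptance rate on the proposal scale can be illustrated with a short simulation. This is a minimal sketch, assuming a standard normal target (not anything specific from the question); the function name is illustrative:

```python
import numpy as np

def metropolis_acceptance_rate(proposal_sd, n_steps=20000, seed=0):
    """Random-walk Metropolis on a standard normal target;
    returns the fraction of proposals accepted."""
    rng = np.random.default_rng(seed)
    log_target = lambda x: -0.5 * x**2  # log density up to a constant
    x = 0.0
    accepted = 0
    for _ in range(n_steps):
        proposal = x + proposal_sd * rng.standard_normal()
        # Symmetric proposal, so the q terms cancel in the MH ratio
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            accepted += 1
            x = proposal
    return accepted / n_steps
```

With a small step size (e.g. `proposal_sd=0.1`) the acceptance rate comes out near 1, while a large step size (e.g. `proposal_sd=10`) drives it well below the 0.2–0.25 target, matching the tuning advice above.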
An easy example of acceptance probability equal to one arises when simulating from the exact target, i.e., when the proposal is $q(x, x') = \pi(x')$ independently of $x$: in that case $$\dfrac{\pi(x')\,q(x', x)}{\pi(x)\,q(x, x')} = \dfrac{\pi(x')\,\pi(x)}{\pi(x)\,\pi(x')} = 1 \qquad \forall\, x, x'.$$ A more practical example is the Gibbs sampler, which can be viewed as a sequence of Metropolis-Hastings steps, each with acceptance probability one.
Your confusion may stem from misinterpreting the Metropolis-Hastings algorithm as an optimization algorithm. The algorithm does not aim for the maximum; it simply spends more iterations in regions where the target is higher. Because the proposal densities $q(x_{\text{MAP}}, x)$ and $q(x, x_{\text{MAP}})$ also enter the acceptance ratio, proposals with lower target values may still be accepted even though $\pi(x_{\text{MAP}}) \ge \pi(x)$ for all $x$.
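The point that downhill moves are not rejected outright can be made concrete. In the symmetric-proposal case, a proposal with lower target density is accepted with probability equal to the density ratio; the helper below is a sketch with hypothetical numbers, not part of the original answer:

```python
import numpy as np

def acceptance_prob(log_pi_current, log_pi_proposal):
    """MH acceptance probability for a symmetric proposal:
    min(1, pi(x') / pi(x)), computed in log space for stability."""
    return min(1.0, np.exp(log_pi_proposal - log_pi_current))

# A move to a point where the target density is half as large
# is still accepted half the time, not rejected outright.
p = acceptance_prob(np.log(0.8), np.log(0.4))
```

This is exactly why the chain explores the whole distribution rather than climbing to the MAP and staying there.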
Fields Institute CMPT898 Lec19: Acceptance Rates and Convergence in MCMC & PMCMC
What is a good acceptance rate for MCMC?
Lastly, the acceptance rate depends on the problem, but typically for one-dimensional problems the acceptance rate should be around 44% (and around 23% for more than five parameters).
How to choose proposal distribution in MCMC?
Instead of using a fixed proposal Q(x′), MCMC algorithms use a conditional proposal distribution Q(x′ | x). This process therefore generates a Markov chain of samples x⁽¹⁾, x⁽²⁾, …. Metropolis-Hastings is one of the most widely used MCMC methods because it allows us to specify any proposal Q(x′ | x); however, picking a good Q(x′ | x) requires care.
How does MCMC work?
In order to sample from a probability distribution, Markov chain Monte Carlo (MCMC) methods build a Markov chain whose stationary distribution is the desired distribution. Samples are then drawn from the chain’s state after a number of steps.
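The description above can be sketched end to end: build the chain, let it run, and use its later states as draws from the target. This is a minimal sketch, assuming a Gaussian random-walk proposal and a made-up N(3, 1) target; the function and variable names are illustrative:

```python
import numpy as np

def mh_sample(log_target, x0, proposal_sd, n_steps, seed=0):
    """Minimal Metropolis-Hastings sampler with a Gaussian
    random-walk proposal; returns the chain as an array."""
    rng = np.random.default_rng(seed)
    chain = np.empty(n_steps)
    x, lp = x0, log_target(x0)
    for i in range(n_steps):
        prop = x + proposal_sd * rng.standard_normal()
        lp_prop = log_target(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            x, lp = prop, lp_prop  # accept: move to the proposal
        chain[i] = x               # reject: record the current state again
    return chain

# Target: N(3, 1) up to a constant; after discarding burn-in,
# the chain's states approximate draws from the target.
chain = mh_sample(lambda x: -0.5 * (x - 3.0) ** 2, x0=0.0,
                  proposal_sd=2.0, n_steps=50000)
mean_est = chain[10000:].mean()
```

Discarding the first portion of the chain (burn-in) reflects the phrase "after a number of steps": early states still depend on the arbitrary starting point, while later states are approximately distributed according to the stationary distribution.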