Description: Gradient descent Hamiltonian Monte Carlo comparison.gif
English: In Bayesian statistics, two classes of technique are commonly used to reconstruct unobserved parameters based on observed data. This plot shows the application of each to a two-dimensional toy problem.
The blue triangle shows maximum a posteriori estimation, in which an optimization algorithm such as gradient descent is used to find the set of parameters that maximizes the posterior probability density. Starting from an arbitrary guess, the triangle follows the gradient until it reaches the point of maximum posterior density, and that single set of parameters is accepted as the best answer.
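A minimal sketch of this idea in Python, assuming an illustrative two-parameter correlated Gaussian posterior and an arbitrary step size; the actual density and optimizer settings behind the animation are not specified here:

```python
# Minimal sketch: MAP estimation by gradient ascent on the log posterior.
# The correlated two-parameter Gaussian density and the step size are
# illustrative assumptions, not the density used to make the animation.
import numpy as np

COV_INV = np.linalg.inv(np.array([[1.0, 0.8],
                                  [0.8, 1.0]]))

def grad_log_posterior(theta):
    # Gradient of the (unnormalized) log posterior density.
    return -COV_INV @ theta

def map_estimate(theta0, step=0.1, n_steps=200):
    # Follow the gradient uphill; the end point is the MAP estimate.
    theta = np.asarray(theta0, dtype=float)
    for _ in range(n_steps):
        theta = theta + step * grad_log_posterior(theta)
    return theta

print(map_estimate([2.0, -1.5]))  # approaches the mode at the origin
```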
The red circles show Hamiltonian Monte Carlo, in which a physics simulation is used to sample the posterior probability distribution. Starting from an arbitrary guess, the simulation stochastically travels to a variety of likely points, which are all accepted as plausible answers.
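A matching sketch of Hamiltonian Monte Carlo on the same assumed toy posterior; the leapfrog step size, trajectory length, and number of samples are illustrative choices, not those used in the animation:

```python
# Minimal sketch: Hamiltonian Monte Carlo on an assumed correlated Gaussian posterior.
import numpy as np

rng = np.random.default_rng(0)
COV_INV = np.linalg.inv(np.array([[1.0, 0.8],
                                  [0.8, 1.0]]))

def log_posterior(theta):
    return -0.5 * theta @ COV_INV @ theta

def grad_log_posterior(theta):
    return -COV_INV @ theta

def hmc_step(theta, step=0.15, n_leapfrog=20):
    # One transition: simulate the fictitious physical system, then accept/reject.
    momentum = rng.standard_normal(theta.shape)
    q, p = theta.copy(), momentum.copy()
    p += 0.5 * step * grad_log_posterior(q)        # leapfrog integration
    for _ in range(n_leapfrog - 1):
        q += step * p
        p += step * grad_log_posterior(q)
    q += step * p
    p += 0.5 * step * grad_log_posterior(q)
    # Metropolis correction keeps the chain sampling the posterior exactly.
    h_old = -log_posterior(theta) + 0.5 * momentum @ momentum
    h_new = -log_posterior(q) + 0.5 * p @ p
    return q if rng.random() < np.exp(h_old - h_new) else theta

theta = np.array([2.0, -1.5])
samples = [theta]
for _ in range(500):
    theta = hmc_step(theta)
    samples.append(theta)        # every retained point is a plausible answer
```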
The two axes of the plot represent two coupled parameters. The shading and contours represent the posterior probability distribution, where white is lower and green is higher.
The person who associated a work with this deed has dedicated the work to the public domain by waiving all of his or her rights to the work worldwide under copyright law, including all related and neighboring rights, to the extent allowed by law. You can copy, modify, distribute and perform the work, even for commercial purposes, all without asking permission.
http://creativecommons.org/publicdomain/zero/1.0/deed.en CC0 Creative Commons Zero, Public Domain Dedication
Captions
An animation comparing maximum a posteriori estimation with Hamiltonian Monte Carlo in two dimensions