Continuing on from last week's experiments with Metropolis-Hastings sampling for probability distributions, I decided to implement Accept-Reject sampling. Accept-Reject seems to deliver well-distributed results faster than Metropolis-Hastings does, but in the long run it can easily introduce biased samples if the proposal distribution q(x) is not well tuned to the specific probability density function. MH, on the other hand, is far more resilient to an improperly chosen proposal distribution, as it will still converge to an unbiased result in most cases, albeit more slowly than it normally would. Accept-Reject also seems to have issues near the truncation bounds of the function, under-sampling values close to the lower and upper limits.
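For reference, here is a minimal Accept-Reject sketch in Matlab, assuming a uniform proposal over the truncation bounds and a constant bound M with M >= f(x) across the whole range (the target, bounds, and M here are all illustrative):

% Accept-Reject sampling: draw N samples from a target f(x) truncated
% to [LB, HB], using a uniform proposal and an upper bound M on f(x).
f  = @(x) sin(x) + 1.5;  % example target (illustrative)
LB = 0; HB = 2*pi;       % truncation bounds (illustrative)
N  = 10000;              % number of samples to keep
M  = 2.5;                % must satisfy M >= max f(x) on [LB, HB]

samples = zeros(N, 1);
i = 1;
while i <= N
    x = LB + (HB - LB) * rand();  % propose uniformly on [LB, HB]
    u = M * rand();               % uniform height under the bound M
    if u < f(x)                   % accept with probability f(x)/M
        samples(i) = x;
        i = i + 1;
    end
end
histogram(samples, 100);          % shape should follow f(x)

If M is set too low, so that f(x) exceeds it anywhere in the range, the peaks of the density get clipped, which is one way the bias creeps in when the proposal is poorly tuned.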
A little update on what I’ve been learning about lately.
Metropolis-Hastings is an algorithm for sampling random values from a probability distribution. Effectively, for a function f(x) you wish to simulate, a random number is more likely to be chosen where the curve is higher. Uniform sampling of the range would mean that all x values on the graph have an equal probability of being chosen; this could be simulated by setting f(x) = 1.
However, if the function f(x) represents a non-uniform curve, then the samples in high-energy (high-probability) regions need to be denser. This can be seen in the simulations of f(x) = sqrt(x-1) and f(x) = sin(1/x) below.
Metropolis-Hastings is best suited to problems where Direct Sampling and other more efficient solutions are not available. Because it relies only on the availability of a computable function f(x) and a proposal density Q(x|x*), M-H can correctly sample unusual probability distributions that would otherwise be difficult or impossible to compute. This can be seen below in the simulations of the sine function f(x) = sin(x) + 1.5.
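Concretely, each M-H step draws a candidate x* from Q(x*|x) and accepts it with probability A = min(1, f(x*)Q(x|x*) / (f(x)Q(x*|x))); with a symmetric proposal this reduces to min(1, f(x*)/f(x)), so moves towards higher f(x) are always taken and moves downhill are only taken occasionally.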
Matlab code for Metropolis-Hastings Monte Carlo Integration (for generating samples)
Below is the Matlab code used to generate the above graph. By modifying the function f(x), the sample count N, the proposal distribution and its variance qV, and the lower and upper truncation bounds LB & HB, any PDF can be simulated.
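A minimal version of such a sampler, using a Gaussian random-walk proposal and treating anything outside [LB, HB] as zero probability, might look like this (the specific f(x), bounds, and settings are illustrative):

% Metropolis-Hastings: draw N samples from a target f(x) (known up to
% a normalising constant) with a Gaussian random-walk proposal of
% variance qV, truncated to [LB, HB].
f  = @(x) sin(x) + 1.5;  % target function (illustrative)
N  = 50000;              % sample count
qV = 1.0;                % proposal variance
LB = 0; HB = 2*pi;       % truncation bounds (illustrative)

samples = zeros(N, 1);
x = LB + (HB - LB) * rand();          % random starting point
for i = 1:N
    xStar = x + sqrt(qV) * randn();   % symmetric Gaussian proposal
    if xStar >= LB && xStar <= HB     % reject moves outside the bounds
        A = min(1, f(xStar) / f(x));  % acceptance probability (symmetric Q)
        if rand() < A
            x = xStar;
        end
    end
    samples(i) = x;                   % record the current state every step
end
histogram(samples, 100);              % approximates the shape of f(x)

Note that a rejected proposal still records the current sample again; dropping those repeats would bias the result.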
Work continues and it's going well! Over the last week I have been teaching myself Bayesian Statistics, which lets you think about probability in terms of degrees of belief. It is also predicated on the concept of priors (previously known data), which let you update the probability of something happening in the future. This allows for convergent solutions on unknown probabilities, which is useful in graphics in terms of being able to fully sample arbitrary probability distributions.
I joke. So far it’s just been a lot of reading papers on graphics, most of which I do not understand. :(
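One idea that has stuck so far is Bayes' theorem: for a hypothesis H and observed data D, P(H|D) = P(D|H) * P(H) / P(D). The prior P(H) encodes what was believed before seeing the data, and the posterior P(H|D) becomes the new prior as more data arrives, which is what lets the estimate converge.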
Anyway, as a fun little side project I’ve been working on a 3D Ray Caster using my old favourites, OpenCL, OpenGL, and C++. It’s quite similar in concept to the renderer for my dissertation project last year but with a simplified rendering method and faster performance.
The goal for this project is to revisit Voxel Rendering which I played around with over the summer, and possibly to revisit game development with a new version of my Aliens First-Person Pacman game.
Currently the program has seven hardcoded Axis-Aligned Bounding Boxes (AABBs) which it renders as the camera orbits around them. I'm working on a method to organise AABBs into a flat-packed octree which can be passed to the GPU. Once this is working it should be trivial to construct an AABB octree of the CT-scanned skull I used before, or to construct a simple game.
Another thing I may look into is modifying the Aliens game to have textured floors and ceilings using floor-casting, and possibly maps with multiple vertical levels.