This thread will be the first of many, I assume. It will focus specifically on my exploration of different means of creating computational models that I'm using, or plan to use, in other areas of interest.
Monte Carlo Integration
Monte Carlo integration is a means of approximating some integral I(f) by means of stochastic variables (random sampling), where Ω is a sample space containing the random variables (X_1, ..., X_m) and f is the density. That is, we approximate the integral by an empirical average, whose convergence is guaranteed by the Strong Law of Large Numbers.
The empirical average is computed as

h_m = (1/m) Σ_{j=1}^{m} h(X_j),

which converges almost surely to E_f[h(X)], and hence can be used as an approximation.
The generalized expected value is

E_f[h(X)] = ∫_Ω h(x) f(x) dx.
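As a concrete sketch of this estimator in Python (the sampler, test function, and sample size below are my own illustrative choices), we draw X_1, ..., X_m from f and average h over the draws:

```python
import numpy as np

def mc_estimate(h, sample_f, m, rng=None):
    """Approximate E_f[h(X)] by the empirical average h_m = (1/m) * sum_j h(X_j)."""
    rng = np.random.default_rng() if rng is None else rng
    x = sample_f(rng, m)        # X_1, ..., X_m drawn from the density f
    return h(x).mean()          # the empirical average h_m

# Sanity check: E[X^2] under N(0,1) is exactly 1.
est = mc_estimate(lambda x: x ** 2,
                  lambda rng, m: rng.standard_normal(m),
                  m=100_000)
print(f"estimate of E[X^2] under N(0,1): {est:.4f}")
```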
I tested the Monte Carlo integration on the function h(x) = (cos(50x) + sin(20x))^2, whose graph can be found below.
To approximate the integral I used N(0,1) as my random variable, and this produced the following estimate and its error,
The approximation, graphed:
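A minimal sketch of that experiment (the sample size m and the random seed are my own illustrative choices, since they aren't stated above; sampling is from N(0,1) as described):

```python
import numpy as np

rng = np.random.default_rng(0)

def h(x):
    return (np.cos(50 * x) + np.sin(20 * x)) ** 2

m = 100_000
x = rng.standard_normal(m)            # X_1, ..., X_m ~ N(0, 1)
hx = h(x)

h_m = hx.mean()                       # Monte Carlo estimate of E_f[h(X)]
se = hx.std(ddof=1) / np.sqrt(m)      # estimated standard error of h_m
print(f"estimate = {h_m:.4f}, std. error = {se:.4f}")
```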
We are also interested in the speed at which h_m converges to E_f[h(X)].
The rate of convergence can be found by evaluating the variance of the approximation,

var(h_m) = (1/m) ∫_Ω (h(x) - E_f[h(X)])^2 f(x) dx,

which can be estimated from the random sample (X_1, ..., X_m) via

v_m = (1/m^2) Σ_{j=1}^{m} (h(X_j) - h_m)^2.
By the Central Limit Theorem, for large m,

(h_m - E_f[h(X)]) / sqrt(v_m)

is approximately distributed as N(0,1). The variance simplifies to

var(h_m) = var_f(h(X)) / m.
We can then plot our approximation together with the variance of the estimate, signaled by the red lines:
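A sketch of how such a plot can be produced (matplotlib, the sample size, and the 1.96 band width are my own choices; the running variance is estimated from the same sample):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

def h(x):
    return (np.cos(50 * x) + np.sin(20 * x)) ** 2

m = 10_000
hx = h(rng.standard_normal(m))                       # h(X_j) with X_j ~ N(0, 1)

n = np.arange(1, m + 1)
run_mean = np.cumsum(hx) / n                         # running estimate h_n
run_var = np.maximum(np.cumsum(hx ** 2) / n - run_mean ** 2, 0.0) / n
band = 1.96 * np.sqrt(run_var)                       # approximate 95% CLT band

plt.plot(n, run_mean, color="black", label="running estimate")
plt.plot(n, run_mean + band, color="red", linewidth=0.8)
plt.plot(n, run_mean - band, color="red", linewidth=0.8, label="± 1.96 std. errors")
plt.xlabel("m")
plt.ylabel("estimate")
plt.legend()
plt.show()
```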
The standard deviation of the estimate is proportional to

1/sqrt(m).

Hence, m must be quadrupled in order to halve the standard deviation; equivalently, the variance itself falls off like 1/m, so doubling m halves the variance.
I explored the proportionality between m and the variance through the approximation of π (a code sketch follows the plots below):
m = 10,000
m = 100,000
m = 1,000,000
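A standard way to run this experiment is the hit-or-miss estimate of π (the quarter-circle setup below is my assumption about the method), repeated for each sample size so the error can be seen shrinking roughly like 1/sqrt(m):

```python
import numpy as np

rng = np.random.default_rng(0)

def estimate_pi(m, rng):
    """Estimate pi as 4 * P(X^2 + Y^2 <= 1) with (X, Y) uniform on [0, 1]^2."""
    x = rng.random(m)
    y = rng.random(m)
    return 4.0 * np.mean(x ** 2 + y ** 2 <= 1.0)

for m in (10_000, 100_000, 1_000_000):
    est = estimate_pi(m, rng)
    print(f"m = {m:>9,}: estimate = {est:.5f}, |error| = {abs(est - np.pi):.5f}")
```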