What are some sources of uncertainty in design? (4)
Structural/material properties
Geometric properties and dimensions
Load conditions, operating conditions
Human factors
How do we deal with uncertainty in design?
Treat uncertain parameters as random variables with a probability distribution
What are Monte Carlo Methods?
Computational methods that estimate quantities by repeated random sampling. Notably, for general problems there is no unbiased estimator that converges faster than MC.
What is the basic process for MC?
What are the main components of MC? (6)
What are some important sampling methods?
What is a Markov chain?
A sequence of random variables in which each variable depends only on the immediately preceding one (the Markov property), so the joint probability factorizes into one-step transition probabilities.
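A minimal sketch of a Markov chain: a two-state chain with hypothetical transition probabilities, where the next state depends only on the current one. The long-run visit fractions approach the chain's stationary distribution, here (5/6, 1/6).

```python
import random

# Two-state Markov chain (states 0 and 1) with hypothetical transitions.
P = {0: [0.9, 0.1],   # from state 0: stay w.p. 0.9, move w.p. 0.1
     1: [0.5, 0.5]}   # from state 1: move w.p. 0.5, stay w.p. 0.5

def step(state, rng):
    """Sample the next state; it depends only on the current state."""
    return 0 if rng.random() < P[state][0] else 1

rng = random.Random(0)
state, visits = 0, [0, 0]
for _ in range(100_000):
    state = step(state, rng)
    visits[state] += 1

# Long-run fraction of time in state 0 approaches 5/6.
frac0 = visits[0] / 100_000
```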
What is Markov chain Monte Carlo (MCMC)?
Exploration by a random walk through a Markov chain whose stationary distribution equals the target distribution; the chain's states are then used as samples for MC computation.
What are two important MCMC methods?
Metropolis-Hastings
Gibbs sampling
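A minimal random-walk Metropolis sketch (a special case of Metropolis-Hastings with a symmetric Gaussian proposal); the function names and step size are illustrative, not from the cards. It targets an unnormalized log-density, here a standard normal.

```python
import math, random

def metropolis(log_p, x0, n, step=1.0, seed=0):
    """Random-walk Metropolis: propose y ~ N(x, step^2), accept with
    probability min(1, p(y)/p(x)); otherwise stay at x."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n):
        y = x + rng.gauss(0.0, step)
        # Accept if log(u) < log p(y) - log p(x); tiny offset avoids log(0).
        if math.log(rng.random() + 1e-300) < log_p(y) - log_p(x):
            x = y
        samples.append(x)
    return samples

# Target: standard normal; the normalizing constant cancels, so it is not needed.
s = metropolis(lambda x: -0.5 * x * x, 0.0, 50_000)
mean = sum(s) / len(s)
```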
What is the Monte Carlo Estimator formula?
\hat{\theta} = \frac{1}{N} \sum_{i=1}^{N} f(x_i)
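The estimator above can be sketched directly: average f over N random draws. The example target E[X^2] for X ~ Uniform(0, 1), whose true value is 1/3, is illustrative.

```python
import random

def mc_estimate(f, sample, n):
    """Monte Carlo estimator: average f over n independent random draws."""
    return sum(f(sample()) for _ in range(n)) / n

# Example: estimate E[X^2] for X ~ Uniform(0, 1); the exact value is 1/3.
random.seed(0)
est = mc_estimate(lambda x: x * x, random.random, 100_000)
```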
What are the steps for utilizing the MC Estimator?
What are three general uses for MC and one specific for engineering?
General:
- Numerical Integration
- Numerical Simulation
- Optimization
Engineering:
- Probabilistic Design
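A classic numerical-integration sketch: estimate pi as 4 times the fraction of uniform random points in the unit square that fall inside the quarter circle.

```python
import random

rng = random.Random(42)
n = 200_000
# Count points (x, y) in the unit square with x^2 + y^2 <= 1.
inside = sum(1 for _ in range(n)
             if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
pi_est = 4 * inside / n
```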
How can MC be useful in optimization?
MC methods tend not to get stuck at local minima: random moves allow occasional exits from a local minimum, so the solver can search for a different (potentially better) minimum.
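A sketch of that idea (simulated-annealing style acceptance; the test function, step size, and temperature are hypothetical): a worse point is accepted with probability exp(-increase / temp), which lets the search escape a local minimum.

```python
import math, random

def noisy_descent(f, x0, n, step=0.5, temp=1.0, seed=0):
    """MC optimization sketch: usually move downhill, but accept a worse
    point with probability exp(-increase / temp) to escape local minima."""
    rng = random.Random(seed)
    x, best = x0, x0
    for _ in range(n):
        y = x + rng.gauss(0.0, step)
        d = f(y) - f(x)
        if d < 0 or rng.random() < math.exp(-d / temp):
            x = y
            if f(x) < f(best):
                best = x
    return best

# Hypothetical double-well function: local minimum near x ~ 1.5,
# global minimum near x ~ -2.3. Start in the local basin.
f = lambda x: (x * x - 4) ** 2 / 10 + x
best = noisy_descent(f, 1.5, 20_000)
```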
Is MC biased or unbiased?
Unbiased
What are some properties of MC error?
What are the three kinds of randomness?
What are some properties of Random number generators?
What is MC inverse transform sampling?
Concept: apply random number generator on a uniform (or other very simple) distribution, then transform to the distribution we need
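A minimal inverse-transform sketch: draw U ~ Uniform(0, 1), then apply the inverse CDF of the target. For Exponential(lam), the inverse CDF is -ln(1 - u) / lam.

```python
import math, random

def sample_exponential(lam, rng):
    """Inverse transform: U ~ Uniform(0, 1) mapped through the
    exponential inverse CDF, F^{-1}(u) = -ln(1 - u) / lam."""
    u = rng.random()
    return -math.log(1.0 - u) / lam

rng = random.Random(1)
draws = [sample_exponential(2.0, rng) for _ in range(100_000)]
mean = sum(draws) / len(draws)  # should approach 1 / lam = 0.5
```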
What is MC rejection sampling?
Idea: find a probability distribution that is close to the original distribution but easier to sample from
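A rejection-sampling sketch with an illustrative target: p(x) = 6x(1-x) on [0, 1] (a Beta(2,2) density), proposal q = Uniform(0, 1), and envelope constant M = 1.5 so that p(x) <= M q(x) everywhere.

```python
import random

def rejection_sample(rng, M=1.5):
    """Draw x from the uniform proposal, accept with probability
    p(x) / (M q(x)); repeat until accepted."""
    while True:
        x = rng.random()
        if rng.random() <= 6 * x * (1 - x) / M:
            return x

rng = random.Random(3)
draws = [rejection_sample(rng) for _ in range(50_000)]
mean = sum(draws) / len(draws)  # Beta(2,2) has mean 0.5
```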
What is MC importance sampling?
Method to approximate expectations for a complex distribution p using a different “proposal distribution” q
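An importance-sampling sketch with illustrative distributions: estimate an expectation under p = N(0, 1) using draws from a proposal q = N(1, 1), reweighting each draw by w(x) = p(x) / q(x).

```python
import math, random

def norm_pdf(x, mu):
    """Density of N(mu, 1)."""
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)

rng = random.Random(7)
n = 100_000
xs = [rng.gauss(1.0, 1.0) for _ in range(n)]          # draws from q
ws = [norm_pdf(x, 0.0) / norm_pdf(x, 1.0) for x in xs]  # weights p/q
# Self-normalized estimate of E_p[X^2]; should approach 1.
est = sum(w * x * x for w, x in zip(ws, xs)) / sum(ws)
```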
What are the 5 Markov chain states?
When is a Markov Chain Irreducible?
When there is a path from any state to any other state.
When is a Markov Chain considered stationary?
When the transition probabilities do not change over time (time-homogeneous)
What is a recurrent state in a Markov chain?
A state that, if reached, will certainly be revisited