Bayesian updating normal distribution
The prior information $\pi$ on $\theta$ is given by an $N(0,\tau^2)$ distribution.

The whole idea is to consider the joint probability of both events, A and B, happening together (a man over 5'10" who plays in the NBA), and then perform some arithmetic on that relationship to produce an updated (posterior) estimate from a prior probability statement. Once you work through simple examples, you slowly start thinking this way about the world. Take people, for example: someone with an open mind holds a prior that assigns at least slight probability to outcomes that are unlikely (for them!).

We assume learners in this course have background knowledge equivalent to what is covered in the earlier three courses in this specialization: "Introduction to Probability and Data," "Inferential Statistics," and "Linear Regression and Modeling." This week, we will discuss the continuous version of Bayes' rule, show you how to use it in a conjugate family, and discuss credible intervals.
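For the normal prior above, the conjugate update can be written out explicitly. As a sketch, assume a single observation $x \sim N(\theta, \sigma^2)$ with known variance $\sigma^2$ (the likelihood model is an assumption here, not stated in the text). Because the normal prior is conjugate to the normal likelihood, the posterior is again normal:

$$
\theta \mid x \;\sim\; N\!\left(\frac{\tau^2}{\sigma^2+\tau^2}\,x,\;\; \frac{\sigma^2\tau^2}{\sigma^2+\tau^2}\right),
$$

where the posterior mean is a precision-weighted compromise between the prior mean ($0$ here) and the observation $x$: the larger $\tau^2$ is relative to $\sigma^2$, the more the posterior trusts the data.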
By the end of this week, you will be able to understand and define the concepts of prior, likelihood, and posterior probability and identify how they relate to one another.
I have to calculate the posterior distribution of $\theta$ and the Bayes estimator.
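As a minimal sketch of that calculation, the snippet below performs the normal-normal conjugate update and reads off the Bayes estimator under squared-error loss (the posterior mean). The function name and the parameters `x`, `sigma2`, `tau2`, and `n` are illustrative choices, not from the source; the likelihood variance `sigma2` is assumed known.

```python
def normal_posterior(x, sigma2, tau2, n=1):
    """Posterior of theta given n observations with sample mean x.

    Prior: theta ~ N(0, tau2).  Likelihood: each observation ~ N(theta, sigma2).
    Returns (posterior_mean, posterior_variance).
    """
    # Precisions (inverse variances) add under the conjugate update.
    precision = n / sigma2 + 1.0 / tau2
    post_var = 1.0 / precision
    # Prior mean is 0, so only the data term contributes to the mean.
    post_mean = post_var * (n * x / sigma2)
    return post_mean, post_var

# Under squared-error loss, the Bayes estimator is the posterior mean.
mean, var = normal_posterior(x=2.0, sigma2=1.0, tau2=1.0)
# With sigma2 = tau2 = 1 and one observation, the posterior mean is x/2 = 1.0
# and the posterior variance is 1/2.
```

Note that with `n` observations the data precision scales as `n / sigma2`, so the posterior concentrates around the sample mean as `n` grows, regardless of the prior variance.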