Probabilistic Models
Part 5: Learning Bayesian networks
Math - part 5
Created by
Merel DJ
Cards (10)
ML Estimation for unconditional distributions
A)
log
B)
max
C)
derivative
D)
0
4
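The recipe on this card (take the log of the likelihood, maximize, set the derivative to 0) can be sketched numerically. This is a minimal illustration with a hypothetical Poisson sample; the data and grid search are assumptions, not from the cards.

```python
import math

# Hypothetical i.i.d. count data; we fit a Poisson rate parameter λ.
data = [2, 3, 1, 4, 2, 3]

def log_likelihood(lam):
    # Taking the log turns the product of densities into a sum:
    # Σ (x·log λ − λ − log x!)
    return sum(x * math.log(lam) - lam - math.lgamma(x + 1) for x in data)

# Maximize numerically over a grid; analytically, setting the derivative
# to 0 gives d/dλ Σ log P = Σx/λ − n = 0, so λ̂ = sample mean.
grid = [i / 100 for i in range(1, 1001)]
lam_ml = max(grid, key=log_likelihood)
lam_analytic = sum(data) / len(data)
```

Both routes agree: the grid maximum coincides with the closed-form sample mean.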
Maximum likelihood estimate for
A)
Bernoulli distribution
B)
binary
C)
x0
D)
x0
E)
x1
F)
occurrences
6
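For a Bernoulli distribution over a binary variable, the ML estimate reduces to counting occurrences of x1 versus x0. A tiny sketch with hypothetical flips:

```python
# Hypothetical binary sample; the ML estimate is
# θ̂ = (# occurrences of x=1) / (total # occurrences).
sample = [1, 1, 0, 1, 0]
n1 = sample.count(1)
theta_hat = n1 / len(sample)  # 3 / 5 = 0.6
```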
Conditional probabilities
A)
discrete
B)
conditional
C)
xi|y
D)
events
E)
D
5
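For discrete conditional probabilities, the ML estimate of P(xi|y) counts joint events in the dataset D and normalizes by the count of the conditioning event. A minimal sketch with a hypothetical dataset:

```python
# Hypothetical dataset D of (x, y) events.
D = [("a", "p"), ("a", "p"), ("b", "p"),
     ("a", "q"), ("b", "q"), ("b", "q")]

def p_cond(x, y):
    # ML estimate of P(x|y): joint event counts over marginal counts of y.
    n_y = sum(1 for xi, yi in D if yi == y)
    n_xy = sum(1 for xi, yi in D if xi == x and yi == y)
    return n_xy / n_y
```

For example, "a" appears in 2 of the 3 events where y = "p", so the estimate of P(a|p) is 2/3.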
ML estimate for the parameters
A)
Normal distribution
B)
Gaussian PDF
C)
Joint Multivariate Normal
D)
Multivariate Gaussian PDF
4
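For a Normal distribution, the ML estimates of the parameters are the sample mean and the sample variance (divided by n). A univariate sketch with hypothetical data; the multivariate Gaussian case replaces these with the mean vector and covariance matrix.

```python
# Hypothetical sample; ML estimates of a univariate Normal's parameters.
xs = [1.0, 2.0, 3.0, 4.0]
mu_ml = sum(xs) / len(xs)
# ML divides by n (not n−1), which slightly underestimates σ²
# on small samples.
var_ml = sum((x - mu_ml) ** 2 for x in xs) / len(xs)
```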
Problem with the ML estimate:
ignores amount of
evidence
bases its decision on
likelihood
single
estimate
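The problem on this card shows up starkly with tiny samples; a hypothetical one-flip example:

```python
# One coin flip that lands heads: ML commits to a single estimate θ = 1,
# ignoring how little evidence backs it, and assigns tails probability 0.
flips = [1]
theta_ml = sum(flips) / len(flips)  # 1.0
```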
Bayesian Parameter estimation:
learns a probability distribution over
all
possible
parameter
values
computes
posterior
distribution
Posterior distribution combines:
strength of
evidence
subjective
expectations
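The two cards above can be sketched with the standard Beta-Bernoulli conjugate pairing (an assumption here, not stated on the cards): the Beta prior encodes the subjective expectations, the observed counts supply the strength of evidence, and the result is a posterior distribution over all possible θ values.

```python
import math

a, b = 2.0, 2.0        # hypothetical prior pseudo-counts (expectations)
heads, tails = 7, 3    # hypothetical observed data (evidence)

# The posterior is again a Beta distribution over ALL possible θ,
# not a single point estimate.
a_post, b_post = a + heads, b + tails

def posterior_pdf(theta):
    # Beta(a_post, b_post) density, normalized by the Beta function.
    B = math.gamma(a_post) * math.gamma(b_post) / math.gamma(a_post + b_post)
    return theta ** (a_post - 1) * (1 - theta) ** (b_post - 1) / B
```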
P(D) is hard to calculate
integral
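The hard part is the normalizer P(D) = ∫ P(D|θ) P(θ) dθ, which has no closed form in general. A numeric sketch for a hypothetical uniform prior and a few Bernoulli observations:

```python
# P(D) = ∫ P(D|θ) P(θ) dθ, approximated by a Riemann sum over θ in (0, 1)
# with a uniform prior P(θ) = 1 (hypothetical setup).
heads, tails = 3, 1
N = 10_000
p_D = sum((k / N) ** heads * (1 - k / N) ** tails for k in range(1, N)) / N
```

For this toy case the integral is actually known (∫θ³(1−θ)dθ = 1/20 = 0.05), which lets us check the approximation.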
How to choose a final single estimate θ
A)
Maximum Likelihood
B)
Maximum A Posteriori
C)
Posterior Mean
3
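For a Beta posterior the three choices on this card have simple closed forms (Maximum Likelihood ignores the prior; MAP is the posterior mode; the posterior mean averages over θ). A sketch with hypothetical posterior pseudo-counts:

```python
# Hypothetical Beta(a, b) posterior over θ.
a, b = 9.0, 5.0

theta_map = (a - 1) / (a + b - 2)   # Maximum A Posteriori: posterior mode
theta_mean = a / (a + b)            # Posterior Mean
# Maximum Likelihood would drop the prior pseudo-counts entirely and
# use the raw observed counts instead.
```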
Bayesian
Model averaging :
result of parameter estimation is a probability
distribution
over all possible θ
we could use all possible parameter settings to perform inference
"true
bayesian
approach" -> to complicated