Cards (10)

  • ML Estimation for unconditional distributions
    A) log
    B) max
    C) derivative
    D) 0
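The recipe on this card (take the log of the likelihood, maximize it, set the derivative to 0) can be sketched in Python. The exponential sample and all variable names below are illustrative assumptions, not from the card:

```python
import math

data = [0.8, 1.3, 0.4, 2.1, 0.9]  # hypothetical samples, assumed Exp(lam)

def log_likelihood(lam, xs):
    # log L(lam) = sum_i (log lam - lam * x_i)
    return sum(math.log(lam) - lam * x for x in xs)

# derivative: n/lam - sum(x_i) = 0  =>  lam_ML = n / sum(x_i)
lam_ml = len(data) / sum(data)

# numeric check: the log-likelihood at lam_ml beats nearby values
assert log_likelihood(lam_ml, data) >= log_likelihood(lam_ml - 0.05, data)
assert log_likelihood(lam_ml, data) >= log_likelihood(lam_ml + 0.05, data)
```

Taking the log turns the product of per-sample probabilities into a sum, which makes the derivative step tractable.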
  • Maximum likelihood estimate for
    A) Bernoulli distribution
    B) binary
    C) x0
    D) x0
    E) x1
    F) occurrences
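For the Bernoulli case, the ML estimate is just the relative frequency: count the occurrences of x1 among all binary observations. A minimal sketch with hypothetical data:

```python
data = [1, 0, 1, 1, 0, 1, 1, 0]  # hypothetical binary sample

n1 = data.count(1)          # occurrences of x1
n0 = data.count(0)          # occurrences of x0
theta_ml = n1 / (n0 + n1)   # ML estimate of P(x = 1)
# P(x = 0) is then 1 - theta_ml
```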
  • Conditional probabilities
    A) discrete
    B) conditional
    C) xi|y
    D) events
    E) D
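For discrete variables, conditional probabilities P(xi | y) are estimated from counts of events in the dataset D. A sketch with a hypothetical labeled dataset:

```python
from collections import Counter

# hypothetical dataset D of (x, y) pairs
D = [("a", "+"), ("a", "+"), ("b", "+"), ("a", "-"), ("b", "-"), ("b", "-")]

joint = Counter(D)                  # counts of (x, y) events
marg_y = Counter(y for _, y in D)   # counts of y events

def p_x_given_y(x, y):
    # ML estimate: count(x, y) / count(y)
    return joint[(x, y)] / marg_y[y]
```

E.g. `p_x_given_y("a", "+")` returns 2/3 here, since "a" occurs in 2 of the 3 "+"-labeled pairs.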
  • ML estimate for the parameters
    A) Normal distribution
    B) Gaussian PDF
    C) Joint Multivariable Normal
    D) Multivariate Gaussian PDF
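For the (univariate) Normal distribution, the ML estimates are the sample mean and the sample variance with a 1/n normalizer. A sketch on hypothetical data:

```python
data = [2.1, 1.9, 2.4, 2.0, 1.6]  # hypothetical Gaussian sample
n = len(data)

mu_ml = sum(data) / n                              # ML mean
var_ml = sum((x - mu_ml) ** 2 for x in data) / n   # ML variance: 1/n, not 1/(n-1)
```

The 1/n (rather than the unbiased 1/(n-1)) normalizer is what maximizing the Gaussian likelihood actually yields; in the multivariate case the same pattern gives the sample mean vector and the 1/n covariance matrix.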
  • Problem with the ML estimate :
    • ignores amount of evidence
    • bases its decision on likelihood
    • single estimate
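The "ignores amount of evidence" point can be made concrete: ML returns the same single estimate whether it is backed by one observation or a thousand (hypothetical coin-flip data):

```python
few = [1]            # one flip, one heads
many = [1] * 1000    # a thousand flips, all heads

# both give theta_ML = 1.0 -- the estimate carries no notion of
# how much evidence supports it
assert sum(few) / len(few) == sum(many) / len(many) == 1.0
```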
  • Bayesian Parameter estimation :
    • learns a probability distribution over all possible parameter values
    • computes posterior distribution
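For a Bernoulli likelihood with a Beta prior, the posterior distribution is available in closed form, which makes the "distribution over all possible parameter values" idea concrete. Prior pseudo-counts and data below are hypothetical:

```python
a, b = 2, 2              # hypothetical Beta(a, b) prior over theta
data = [1, 0, 1, 1]      # hypothetical binary observations

n1, n0 = data.count(1), data.count(0)

# posterior is Beta(a + n1, b + n0): a full distribution over theta,
# not a single point estimate
a_post, b_post = a + n1, b + n0
```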
  • Posterior distribution combines :
    • strength of evidence
    • subjective expectations
    • P(D) is a hard-to-calculate integral
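Written out, the posterior is given by Bayes' rule, and the normalizer P(D) is the integral the card refers to:

```latex
P(\theta \mid D) = \frac{P(D \mid \theta)\,P(\theta)}{P(D)},
\qquad
P(D) = \int P(D \mid \theta)\,P(\theta)\,d\theta
```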
  • Bayesian parameter estimation
    A) Maximum likelihood
    B) Maximum A posteriori
    C) posterior mean
  • How to choose a final single estimate θ
    A) Maximum Likelihood
    B) Maximum A Posteriori
    C) Posterior Mean
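The three candidate single estimates can be compared side by side in the Beta-Bernoulli setting, where each has a closed form. Prior and data are hypothetical:

```python
a, b = 2, 2              # hypothetical Beta prior
data = [1, 1, 1, 0]      # hypothetical binary observations

n1, n0 = data.count(1), data.count(0)
a_p, b_p = a + n1, b + n0                 # posterior Beta(a_p, b_p)

theta_ml   = n1 / (n1 + n0)               # maximum likelihood (ignores prior)
theta_map  = (a_p - 1) / (a_p + b_p - 2)  # maximum a posteriori (posterior mode)
theta_mean = a_p / (a_p + b_p)            # posterior mean
```

With these numbers: ML = 0.75, MAP ≈ 0.667, posterior mean = 0.625; the prior pulls the Bayesian estimates toward 0.5.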
  • Bayesian Model averaging :
    • result of parameter estimation is a probability distribution over all possible θ
    • we could use all possible parameter settings to perform inference
    • "true Bayesian approach" -> too complicated
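Model averaging means predicting with all parameter settings at once, weighted by the posterior: P(x=1 | D) = ∫ θ p(θ | D) dθ. A grid-approximation sketch, assuming a hypothetical Beta(5, 3) posterior from a Bernoulli example:

```python
import math

a_p, b_p = 5, 3   # hypothetical Beta posterior over theta

def beta_pdf(t, a, b):
    # Beta density via the gamma-function form of the normalizer
    B = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
    return t ** (a - 1) * (1 - t) ** (b - 1) / B

# approximate P(x=1 | D) = integral of theta * p(theta | D) on a grid
step = 1e-4
grid = [i * step for i in range(1, 10000)]
p_x1 = step * sum(t * beta_pdf(t, a_p, b_p) for t in grid)
```

For this conjugate case the integral has the closed form a_p / (a_p + b_p) = 0.625, so averaging coincides with the posterior mean; in general no closed form exists, which is why the "true Bayesian approach" is usually too complicated.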