Cards (35)

  • Random Variable X
    Variable with a fixed domain Val(X), which represents some aspect of the system's world.
  • Variable Types :
    • Boolean {false, true}
    • Discrete variable --> categorical
    • Continuous variable --> numerical
  • Variables are the basic atomic building blocks of our models and world representation
  • Event : a fixed assignment of values to some or all of the variables in the system's world
  • Atomic Event : an event where all random variables in the system's world have a specific value assigned
  • An atomic event corresponds to one particular possible state of the world
  • An event corresponds to a set of possible states of the world
  • Number of possible atomic events = \prod_i |Val(X_i)| (the product of the number of possible values for every variable)
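    For example, a minimal Python sketch of this counting, assuming a hypothetical world with two Boolean variables and one three-valued variable:

    ```python
    from itertools import product

    # Hypothetical domains Val(X) for each variable in the world.
    domains = {
        "Rain": [False, True],
        "Traffic": [False, True],
        "Season": ["winter", "summer", "other"],
    }

    # An atomic event assigns one value to every variable.
    atomic_events = list(product(*domains.values()))

    # The count is the product of the domain sizes: 2 * 2 * 3 = 12.
    print(len(atomic_events))
    ```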
  • A probability distribution over S is a function P : S -> R that satisfies :
    1. P(\alpha) \geq 0, \forall \alpha \in S
    2. P(\Omega) = 1, where \Omega is the disjunction of all possible events
    3. if \alpha, \beta \in S and \alpha \cap \beta = \emptyset, then P(\alpha \cup \beta) = P(\alpha) + P(\beta)
  • A probability P(\alpha) is the value that the probability distribution P assigns to the specific event \alpha
  • Frequentist interpretation : the probability of an event \alpha is the proportion of times that \alpha would occur if we repeated the experiment an infinite number of times
  • Subjectivist interpretation : the probability of an event expresses a subjective degree of belief that the event \alpha will happen
  • We use the subjectivist (Bayesian) interpretation : P(x) represents the system's degree of belief that x is true in the world
  • Full Joint Distribution : the probability distribution over all atomic events possible over the set of variables X
  • Marginal Distribution : a probability distribution defined over the events induced by a subset of the variables in X
  • Marginal distribution of variable X : a probability distribution defined over the values of a single variable X in X
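    As a sketch of a full joint distribution (with made-up numbers for two Boolean variables), stored as one probability per atomic event; the entries are non-negative and sum to 1, as the axioms above require:

    ```python
    # Hypothetical full joint distribution over two Boolean variables X and Y.
    # Keys are atomic events (one value per variable), values are probabilities.
    joint = {
        (True,  True):  0.20,
        (True,  False): 0.10,
        (False, True):  0.30,
        (False, False): 0.40,
    }

    # Axiom check: non-negative entries that sum to 1 over all atomic events.
    assert all(p >= 0 for p in joint.values())
    assert abs(sum(joint.values()) - 1.0) < 1e-9
    ```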
  • P(X = x, Y = y), P(x, y), P((X = x) \cap (Y = y))
    Probability of a conjunction of events
  • P(X) = P(X_1, X_2, \dots, X_k)
    Joint distribution over a set of variables
  • P(X, Y) = P(X_1, \dots, X_k, Y_1, \dots, Y_l)
    Joint distribution over several sets of variables
  • P(X | Y) = P(X_1, X_2, \dots, X_k | Y_1, Y_2, \dots, Y_l)
    Conditional distribution, the joint distribution over X conditioned on values of Y
  • In a proper distribution, the entries sum to 1.0
  • A marginal probability P(X = x) is computed by summing all entries in the full joint distribution that have X = x.
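    A minimal sketch of that summation, reusing the hypothetical joint table from the sketch above:

    ```python
    # Hypothetical full joint distribution over Boolean variables (X, Y).
    joint = {
        (True,  True):  0.20,
        (True,  False): 0.10,
        (False, True):  0.30,
        (False, False): 0.40,
    }

    # P(X = x): sum every entry of the full joint distribution where X = x.
    def marginal_x(x):
        return sum(p for (xv, yv), p in joint.items() if xv == x)

    print(round(marginal_x(True), 2))   # 0.3
    print(round(marginal_x(False), 2))  # 0.7
    ```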
  • P(\alpha | \beta) = \frac{P(\alpha \cap \beta)}{P(\beta)}
    Conditional probability of \alpha given that we know that \beta is true
  • Conditioning
    the operation that takes one distribution P(X) and returns another distribution P(X | \beta)
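    A sketch of conditioning as an operation on the table: keep the entries consistent with the evidence \beta and renormalize by P(\beta) (hypothetical numbers again):

    ```python
    # Hypothetical full joint distribution over Boolean variables (X, Y).
    joint = {
        (True,  True):  0.20,
        (True,  False): 0.10,
        (False, True):  0.30,
        (False, False): 0.40,
    }

    # Condition on the event beta: Y = True. P(X | beta) = P(X, beta) / P(beta).
    p_beta = sum(p for (_, yv), p in joint.items() if yv is True)
    conditional = {
        xv: sum(p for (x2, yv), p in joint.items() if x2 == xv and yv is True) / p_beta
        for xv in (True, False)
    }
    print(conditional)  # {True: 0.4, False: 0.6}
    ```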
  • Consequences of the chain rule :
    P(\alpha \cap \beta) = P(\alpha | \beta) P(\beta)
    P(\alpha \cap \beta) = P(\beta \cap \alpha) = P(\beta | \alpha) P(\alpha) = P(\alpha) P(\beta | \alpha)
  • Chain rule
    P(X_1, X_2, \dots, X_N) = P(X_1, \dots, X_k) P(X_{k+1}, \dots, X_N | X_1, \dots, X_k)
  • Law of total probability
    P(x) = \sum_{y \in Val(Y)} P(x, y) = \sum_{y \in Val(Y)} P(x | y) P(y)
  • Marginal distributions
    P(X) = \sum_y P(X, y) = \sum_y P(X | y) P(y)
  • Conditional distributions
    P(X | Z) = \sum_y P(X, y | Z) = \sum_y P(X | y, Z) P(y | Z)
  • P(\alpha | \beta) = \frac{P(\beta | \alpha) P(\alpha)}{P(\beta)}
    Bayes' rule
    • allows us to compute a conditional probability P(\alpha | \beta) from P(\beta | \alpha)
  • P(ProblemsSymptoms)=P(Problems | Symptoms) =P(SymptomsProblem)P(Problem)P(Sympotoms) \frac{P(Symptoms |Problem) P(Problem)}{P(Sympotoms)}
    (Problems | Symptoms) = \frac{P(Symptoms |Problem) P(Problem)}{P(Sympotoms)}
    Prior Probability = P(Problem)
    Posterior probability = P(Problem | Symptoms)
    Likelihood = P(Symptoms | Problem)
    Evidence = P(Symptoms)
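    A minimal sketch of this computation with invented numbers for a single Boolean problem and symptom; the evidence P(Symptoms) is obtained via the law of total probability:

    ```python
    # Hypothetical numbers: prior and likelihoods for Boolean Problem and Symptoms.
    p_problem = 0.01                      # prior  P(Problem)
    p_symptoms_given_problem = 0.9        # likelihood  P(Symptoms | Problem)
    p_symptoms_given_no_problem = 0.05    # P(Symptoms | not Problem)

    # Evidence P(Symptoms) via the law of total probability.
    p_symptoms = (p_symptoms_given_problem * p_problem
                  + p_symptoms_given_no_problem * (1 - p_problem))

    # Bayes' rule: posterior P(Problem | Symptoms).
    posterior = p_symptoms_given_problem * p_problem / p_symptoms
    print(round(posterior, 3))  # ~0.154
    ```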
  • Prior probability : degree of belief in Problem before we have observed anything else
  • Posterior probability : degree of belief in Problem after we have observed Symptoms
  • Likelihood : probability with which Problem produces Symptoms
  • Evidence : probability of these Symptoms occurring at all