CS50's Intro to AI w/ Python
Week 2: Uncertainty
Cards (19)
The probabilities of each world ω ("small omega") in the set of all possible worlds Ω ("big omega") should sum to 1: Σ P(ω) = 1.
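A minimal sketch of this axiom in Python, using a fair six-sided die as the set of possible worlds (the die and its probabilities are illustrative assumptions, not from the cards):

```python
from fractions import Fraction

# Each possible world ω (a face of a fair die) gets a probability P(ω).
worlds = {face: Fraction(1, 6) for face in range(1, 7)}

# The probabilities over all worlds in Ω must sum to 1.
print(sum(worlds.values()))  # 1
```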
Unconditional Probability: degree of belief in a proposition without any other evidence.
Example: rolling a fair die, where each face has a 1/6 probability.
Conditional Probability: degree of belief in a proposition given that other evidence is provided.
Example: rolling two dice for a sum of 10, knowing one die has already rolled a 4.
Random Variable: a variable in probability theory with a domain of possible values.
Example: weather = {sun, rain, snow, dry}
Probability Distribution: assigns a probability to each possible value in a random variable's domain.
Independence: one event occurring doesn't affect the probability of the other.
Example: rolling a 2 on one die does not make it more or less likely that another die will roll a 5.
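A quick check of this by enumerating all 36 equally likely outcomes of two dice (the specific events mirror the example above):

```python
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))  # all 36 (die1, die2) pairs

p_five = sum(1 for a, b in outcomes if b == 5) / len(outcomes)

# Condition on die1 == 2 and recompute the probability that die2 == 5.
given_two = [(a, b) for a, b in outcomes if a == 2]
p_five_given_two = sum(1 for a, b in given_two if b == 5) / len(given_two)

print(p_five, p_five_given_two)  # both 1/6: the events are independent
```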
Bayes' Rule: the probability of b given a equals the probability of b, times the probability of a given b, divided by the probability of a:
P(b | a) = P(b) P(a | b) / P(a)
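A numeric sketch of Bayes' Rule; the rain/clouds scenario and its probabilities below are invented for illustration:

```python
# Hypothetical numbers: P(rain), P(clouds | rain), P(clouds).
p_rain = 0.1
p_clouds_given_rain = 0.8
p_clouds = 0.4

# Bayes' Rule: P(b | a) = P(b) * P(a | b) / P(a), with b = rain, a = clouds.
p_rain_given_clouds = p_rain * p_clouds_given_rain / p_clouds
print(p_rain_given_clouds)  # 0.2
```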
Probability Rule #1: Negation
P(¬a) = 1 − P(a)
Probability Rule #2
Inclusion-exclusion
probability of a
or
b = probability of a
add
probability of b,
minus
probability a
and
b
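Rules #1 and #2 checked by enumerating two dice; the events chosen (the dice sum to 10, the first die shows a 4) are arbitrary picks for the demo:

```python
from fractions import Fraction
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))

def p(event):
    """Exact probability of an event over all 36 equally likely outcomes."""
    return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

a = lambda o: o[0] + o[1] == 10   # the dice sum to 10
b = lambda o: o[0] == 4           # the first die shows a 4

# Rule #1 (Negation): P(¬a) = 1 − P(a)
assert p(lambda o: not a(o)) == 1 - p(a)

# Rule #2 (Inclusion-Exclusion): P(a ∨ b) = P(a) + P(b) − P(a ∧ b)
assert p(lambda o: a(o) or b(o)) == p(a) + p(b) - p(lambda o: a(o) and b(o))
print(p(a), p(b))  # 1/12 1/6
```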
Probability Rule #4: Marginalization of a random variable
P(X = x_i) = Σ_j P(X = x_i, Y = y_j)
Probability Rule #3: Marginalization
P(a) = P(a, b) + P(a, ¬b)
Allows for turning a joint probability into a single (marginal) probability.
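Marginalization over a small joint table; the joint probabilities for (rain, clouds) below are invented for the sketch:

```python
# Hypothetical joint distribution over (rain, clouds).
joint = {
    ("rain", "clouds"): 0.08,
    ("rain", "no clouds"): 0.02,
    ("no rain", "clouds"): 0.32,
    ("no rain", "no clouds"): 0.58,
}

# Rule #3: P(a) = P(a, b) + P(a, ¬b)
p_rain = joint[("rain", "clouds")] + joint[("rain", "no clouds")]

# Rule #4 generalizes this: sum the joint over every value of the other variable.
p_rain_alt = sum(v for (r, _), v in joint.items() if r == "rain")

print(p_rain, p_rain_alt)  # both 0.1 (up to floating-point rounding)
```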
Probability Rule #5: Conditioning
P(a) = P(a | b) P(b) + P(a | ¬b) P(¬b)
Turns a conditional probability into a single (marginal) probability.
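Conditioning with the same invented rain/clouds numbers as above; P(a) is rebuilt from the two conditional branches:

```python
# Hypothetical values: P(clouds), P(rain | clouds), P(rain | ¬clouds).
p_clouds = 0.4
p_rain_given_clouds = 0.2
p_rain_given_no_clouds = 1 / 30

# Rule #5: P(a) = P(a | b) P(b) + P(a | ¬b) P(¬b)
p_rain = (p_rain_given_clouds * p_clouds
          + p_rain_given_no_clouds * (1 - p_clouds))
print(p_rain)  # 0.1 (up to floating-point rounding)
```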
Probability Rule #5b: Conditioning of a random variable
P(X = x_i) = Σ_j P(X = x_i | Y = y_j) P(Y = y_j)
Bayesian Network: a data structure representing dependencies between random variables.
A directed graph connects nodes (representing random variables); each node x has a probability distribution P(x | Parents(x)).
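A hand-rolled sketch of a tiny Bayesian network; the rain → maintenance → train structure and every number are illustrative assumptions. The joint probability of an assignment is the product of each node's P(x | parents(x)):

```python
# Hypothetical network: rain -> maintenance -> train (all variables boolean).
p_rain = {True: 0.3, False: 0.7}                # P(rain)
p_maint = {True: {True: 0.4, False: 0.6},       # P(maintenance | rain)
           False: {True: 0.8, False: 0.2}}
p_train = {True: {True: 0.9, False: 0.1},       # P(train on time | maintenance)
           False: {True: 0.6, False: 0.4}}

def joint(rain, maintenance, on_time):
    """P(rain, maintenance, on_time) = product of each node's P(x | parents(x))."""
    return p_rain[rain] * p_maint[rain][maintenance] * p_train[maintenance][on_time]

# e.g. P(rain, no maintenance, train on time)
print(joint(True, False, True))  # 0.3 * 0.6 * 0.6 ≈ 0.108
```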
Markov Assumption: assumes the current state depends only on a finite number of previous states.
Markov Chain: a sequence of random variables where the distribution of each variable follows the Markov assumption.
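Sampling a Markov chain under the Markov assumption; the sun/rain transition probabilities are invented:

```python
import random

# Hypothetical transition model: P(tomorrow | today).
transitions = {
    "sun":  {"sun": 0.8, "rain": 0.2},
    "rain": {"sun": 0.3, "rain": 0.7},
}

def sample_chain(start, length):
    """Each state depends only on the previous one (Markov assumption)."""
    state, chain = start, [start]
    for _ in range(length - 1):
        probs = transitions[state]
        state = random.choices(list(probs), weights=list(probs.values()))[0]
        chain.append(state)
    return chain

print(sample_chain("sun", 10))  # e.g. ['sun', 'sun', 'rain', ...]
```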
Computers can infer information about the hidden state from observations.
Sensor Markov Assumption: assumes each evidence variable depends only on the corresponding state.
Hidden Markov Model: a Markov model for a system with hidden states that generate some observed events.
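A minimal hidden Markov model sketch extending the chain above: hidden weather states emit an observable (whether someone carries an umbrella); all probabilities are invented:

```python
import random

transitions = {"sun": {"sun": 0.8, "rain": 0.2},
               "rain": {"sun": 0.3, "rain": 0.7}}

# Sensor model: P(observation | state) -- each observation depends
# only on the corresponding hidden state (sensor Markov assumption).
emissions = {"sun": {"umbrella": 0.1, "no umbrella": 0.9},
             "rain": {"umbrella": 0.8, "no umbrella": 0.2}}

def sample(dist):
    return random.choices(list(dist), weights=list(dist.values()))[0]

state = "sun"
hidden, observed = [], []
for _ in range(5):
    state = sample(transitions[state])
    hidden.append(state)                       # what we can't see
    observed.append(sample(emissions[state]))  # what the sensor reports

print(hidden, observed)
```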