* Count the parents of each node to figure out the size of its conditional probability table
* If an improper ordering is used, the result is still a valid representation of the joint probability function, but it requires conditional probability tables that are unnatural and difficult to obtain experimentally; it can also inflate the conditional tables, so the size of the table representation is large compared to other orderings (see the sketch below)
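A minimal Python sketch of the counting argument above. The burglary/alarm network and the two orderings are assumed here purely for illustration (they are the standard textbook example), not part of these notes:

<syntaxhighlight lang="python">
# Sketch (assumed illustration): for Boolean variables, a node's CPT needs
# 2**(number of parents) independent entries, so a poor ordering that adds
# extra parents inflates the table sizes.

def cpt_entries(network):
    """network: dict mapping each node to its list of parents."""
    return {node: 2 ** len(parents) for node, parents in network.items()}

# Hypothetical example: the classic burglary/alarm network under a
# causes-first ordering vs. an effects-first (improper) ordering.
natural = {
    "Burglary": [], "Earthquake": [],
    "Alarm": ["Burglary", "Earthquake"],
    "JohnCalls": ["Alarm"], "MaryCalls": ["Alarm"],
}
improper = {  # ordering: MaryCalls, JohnCalls, Alarm, Burglary, Earthquake
    "MaryCalls": [], "JohnCalls": ["MaryCalls"],
    "Alarm": ["MaryCalls", "JohnCalls"],
    "Burglary": ["Alarm"],
    "Earthquake": ["Alarm", "Burglary"],
}

for name, net in (("natural", natural), ("improper", improper)):
    sizes = cpt_entries(net)
    print(name, sizes, "total =", sum(sizes.values()))  # totals: 10 vs 13
</syntaxhighlight>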
 
===Incremental Network Construction===
# Choose a relevant set of variables X that describe the domain
# While variables remain, pick the next variable X (following the chosen ordering) and add a node for it, as sketched below:
## Set Parents(X) to some minimal set of existing nodes such that the conditional independence property is satisfied
## Define the conditional probability table for X
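A rough Python sketch of the construction loop above. The <code>independent_given</code> callable is a hypothetical stand-in for domain/expert judgments of conditional independence, and the greedy parent-pruning is just one simple way to approximate a "minimal" parent set:

<syntaxhighlight lang="python">
def build_network(ordering, independent_given):
    """Incrementally construct a belief network structure.

    ordering          -- variables, ideally causes before effects
    independent_given -- callable(x, node, rest) -> True if x is judged
                         conditionally independent of `node` given `rest`
    """
    parents, added = {}, []
    for x in ordering:
        # Start from all previously added nodes, then drop every node that x
        # is conditionally independent of given the remaining candidates.
        candidates = list(added)
        for node in list(candidates):
            rest = [p for p in candidates if p != node]
            if independent_given(x, node, rest):
                candidates = rest
        parents[x] = candidates   # CPT for x will have 2**len(candidates) rows
        added.append(x)
    return parents

# Toy usage: the judgment simply looks up a known parent set for each variable.
true_parents = {"B": set(), "E": set(), "A": {"B", "E"}, "J": {"A"}, "M": {"A"}}
indep = lambda x, node, rest: node not in true_parents[x]
print(build_network(["B", "E", "A", "J", "M"], indep))
# {'B': [], 'E': [], 'A': ['B', 'E'], 'J': ['A'], 'M': ['A']}
</syntaxhighlight>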
 
===Inferences using belief networks===
* diagnostic inferences (from effects to causes)
* intercausal inferences (between causes of a common effect, e.g. explaining away)
* mixed inferences (combining two or more of the above); see the sketch below
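A small sketch of these query types using brute-force enumeration over a hypothetical Boolean network Burglary → Alarm ← Earthquake; the CPT numbers are made up for illustration:

<syntaxhighlight lang="python">
from itertools import product

P_B = {True: 0.01, False: 0.99}                    # P(Burglary)
P_E = {True: 0.02, False: 0.98}                    # P(Earthquake)
P_A = {(True, True): 0.95, (True, False): 0.94,    # P(Alarm=True | B, E)
       (False, True): 0.29, (False, False): 0.001}

def joint(b, e, a):
    pa = P_A[(b, e)] if a else 1 - P_A[(b, e)]
    return P_B[b] * P_E[e] * pa

def query(target, evidence):
    """P(target=True | evidence) by summing the full joint distribution."""
    num = den = 0.0
    for b, e, a in product([True, False], repeat=3):
        world = {"B": b, "E": e, "A": a}
        if any(world[k] != v for k, v in evidence.items()):
            continue
        p = joint(b, e, a)
        den += p
        if world[target]:
            num += p
    return num / den

# Diagnostic inference (effect -> cause): the alarm raises belief in burglary.
print(query("B", {"A": True}))             # ~0.58
# Intercausal inference (explaining away): an earthquake explains the alarm,
# lowering belief in burglary again.
print(query("B", {"A": True, "E": True}))  # ~0.03
# Mixed inference combines evidence on causes and effects in the same query.
</syntaxhighlight>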
 
 
==Information entropy - the measure of uncertainty==
* Information - the reduction in uncertainty derived from learning an outcome.
* The uncertainty contained in a probability distribution is the (negative) average log-probability of an event
* Information entropy <math>H(p) = -\operatorname{E}[\log(p_i)] = - \sum_{i=1}^{n} p_i \log(p_i)</math>
** n different possible events
** each event i has probability <math>p_i</math>
* The maximum is H = log(n), the log of the number of possible outcomes/states, reached when all outcomes are equally likely
* The measure of uncertainty decreases from about 0.61 (for p1=0.3, p2=0.7) to about 0.06 when the probabilities are p1=0.01 and p2=0.99: there is much less uncertainty on any given day (computed in the sketch after this list)
* Maximum entropy - given what we know, the least surprising distribution is the one with the highest entropy consistent with that knowledge
* Conditional entropy
** <math>H(Y|X) = \sum\limits_{x \in X} \Pr(x) H(Y|X=x)</math>
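A minimal Python sketch of the quantities above, using natural logs (which give the ~0.61 and ~0.06 figures); the toy joint distribution used for H(Y|X) is an assumption for illustration:

<syntaxhighlight lang="python">
import math

def entropy(p):
    """H(p) = -sum_i p_i log(p_i), skipping zero-probability events."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

print(entropy([0.3, 0.7]))    # ~0.61
print(entropy([0.01, 0.99]))  # ~0.06, far less uncertainty

def conditional_entropy(joint):
    """H(Y|X) = sum_x Pr(x) H(Y|X=x); joint[x][y] = Pr(X=x, Y=y)."""
    h = 0.0
    for row in joint.values():
        px = sum(row.values())
        if px > 0:
            h += px * entropy([pxy / px for pxy in row.values()])
    return h

# Hypothetical joint distribution over X = today's weather, Y = tomorrow's.
joint = {"rain": {"rain": 0.20, "sun": 0.10},
         "sun":  {"rain": 0.10, "sun": 0.60}}
print(conditional_entropy(joint))  # uncertainty about Y remaining once X is known
</syntaxhighlight>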
===Divergence===
* H(p,q) is not equal to H(q,p). E.g., there is more uncertainty induced by using Mars to predict Earth than vice versa. The reason is that Mars has so little water on its surface that, going from Mars to Earth, we would be very surprised when we most likely land on water on Earth
* If we use a distribution with high entropy to approximate an unknown distribution of true events, we will reduce the distance to the truth and therefore the error.
* Cross-entropy = entropy + KL divergence: <math>H(p,q) = H(p) + D_{KL}(p \| q)</math> (see the sketch below)
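A minimal sketch of the Earth/Mars example; the proportions (~70% vs ~1% water) are assumed for illustration. It shows the asymmetry of the divergence and the cross-entropy identity above:

<syntaxhighlight lang="python">
import math

def entropy(p):
    return -sum(pi * math.log(pi) for pi in p)

def cross_entropy(p, q):
    """H(p, q): average surprise when q is used to predict events drawn from p."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))

def kl_divergence(p, q):
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

earth = [0.7, 0.3]   # P(water), P(land)
mars  = [0.01, 0.99]

print(kl_divergence(earth, mars))  # ~2.6: using Mars to predict Earth
print(kl_divergence(mars, earth))  # ~1.1: using Earth to predict Mars
# Cross-entropy = entropy + KL divergence:
print(cross_entropy(earth, mars), entropy(earth) + kl_divergence(earth, mars))
</syntaxhighlight>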
 
==Mutual Information==
* [https://www.youtube.com/watch?v=U9h1xkNELvY Youtube vid]