Calculating entropy of Naive Bayes random variables
Suppose a Naive Bayes graphical model with binary random variables is
given by $$P(y,x_1,x_2,\dots,x_n)=P(y)\,P(x_1|y)\cdots P(x_n|y)$$
Attempting to calculate the mutual information $I(x_1,\dots,x_n;y)$ raises the
question: how can the entropy $H(x_1,\dots,x_n)$ of the marginal
$P(x_1,\dots,x_n)=\sum_y P(y)\prod_i P(x_i|y)$ be computed efficiently? The
conditional entropy $H(x_1,\dots,x_n|y)$ factorizes into a sum of $n$
per-feature terms, but the marginal does not factorize, so the naive approach
sums over all $2^n$ configurations. There must be a better way than computing
an exponential number of entries.
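One common workaround (an illustration, not an answer from the post) is a Monte Carlo estimate: although the marginal has $2^n$ entries, any *single* configuration's probability $P(x)=\sum_y P(y)\prod_i P(x_i|y)$ costs only $O(n)$ to evaluate, and $H(X)=-\mathbb{E}[\log P(X)]$ can be approximated by averaging $-\log P(x)$ over samples drawn from the model. The parameter values below are hypothetical:

```python
import math
import random

random.seed(0)

# Hypothetical binary Naive Bayes parameters (illustrative, not from the post).
n = 20
p_y1 = 0.3                                            # P(y = 1)
p_x1 = [[random.uniform(0.1, 0.9) for _ in range(n)]  # P(x_i = 1 | y)
        for _ in range(2)]

def log_p_x(x):
    """log P(x_1,...,x_n), evaluated in O(n) via the mixture over y."""
    logs = []
    for y, p_y in ((0, 1 - p_y1), (1, p_y1)):
        lp = math.log(p_y)
        for i, xi in enumerate(x):
            p = p_x1[y][i] if xi == 1 else 1 - p_x1[y][i]
            lp += math.log(p)
        logs.append(lp)
    m = max(logs)                      # log-sum-exp over the two values of y
    return m + math.log(sum(math.exp(l - m) for l in logs))

def sample_x():
    """Ancestral sampling: draw y, then each x_i independently given y."""
    y = 1 if random.random() < p_y1 else 0
    return [1 if random.random() < p_x1[y][i] else 0 for i in range(n)]

# Monte Carlo estimate of H(X) = -E[log P(X)], in nats.
num_samples = 5000
H_est = -sum(log_p_x(sample_x()) for _ in range(num_samples)) / num_samples
```

The estimate converges at the usual $O(1/\sqrt{m})$ Monte Carlo rate, trading the exponential sum for sampling error; it does not give an exact closed form, which as far as I know does not exist for this marginal in general.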