Library
What is "Relative Mutual Information"?
Library
The mutual information between X and Y, written I(X,Y), is the difference between the marginal entropy of X and its conditional entropy given Y:

I(X,Y) = H(X) - H(X|Y)

The relative mutual information is then defined as:

I_R(X,Y) = I(X,Y) / H(X)

It expresses the mutual information between X and Y as a fraction of the initial entropy of X, so it can be read as a percentage: the share of the uncertainty in X that is removed by knowing Y.
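As a minimal sketch of the definitions above, here is how the ratio could be estimated from observed (x, y) pairs; the function names and the sample data are illustrative assumptions, not part of any standard library.

```python
import math
from collections import Counter

def entropy(probs):
    # Shannon entropy in bits; zero-probability outcomes contribute nothing
    return -sum(p * math.log2(p) for p in probs if p > 0)

def relative_mutual_information(pairs):
    """Estimate I_R(X,Y) = I(X,Y) / H(X) from a list of (x, y) samples."""
    n = len(pairs)
    p_xy = {k: c / n for k, c in Counter(pairs).items()}
    p_x, p_y = Counter(), Counter()
    for (x, y), p in p_xy.items():
        p_x[x] += p
        p_y[y] += p
    h_x = entropy(p_x.values())
    # I(X,Y) = sum over (x,y) of p(x,y) * log2( p(x,y) / (p(x) * p(y)) )
    i_xy = sum(p * math.log2(p / (p_x[x] * p_y[y]))
               for (x, y), p in p_xy.items())
    return i_xy / h_x  # fraction of H(X); multiply by 100 for percent

# Example: X fully determines Y, so I(X,Y) = H(X) and I_R = 1 (i.e. 100%)
pairs = [(0, 'a'), (0, 'a'), (1, 'b'), (1, 'b')]
print(relative_mutual_information(pairs))  # 1.0
```

Since 0 <= I(X,Y) <= H(X), the ratio always lies between 0 (X and Y independent) and 1 (Y fully determines X).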