lhoffmeyer
Occasionally, when I run Network Performance on the Target node, I receive the message "Posterior Probability Distribution of the Target node is uniform ..."

What does this mean, and what are its implications for the R2 and Overall Precision of the model?

TIA,

LCH
Dan
You got this message because the posterior distribution of your target node is uniform, and therefore BayesiaLab cannot choose between the states for building the Occurrence Matrix. This uniform distribution can be a correct one (i.e. 1/n, where n is the number of states), or a distribution with only 0s, which indicates that inference is impossible with this set of evidence. It is usually the latter case: your network has learned some deterministic relationships that do not hold in the test set.

To prevent this, simply use the "Smooth Probability Estimation" option while learning your network. This adds some virtual samples to your dataset to represent a non-informative prior stating that "everything is possible". This option is almost mandatory when you plan to use your network on a population that was not used for learning.
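To illustrate the idea behind virtual samples, here is a minimal Python sketch (not BayesiaLab's actual code; the function and variable names are made up). It shows how a state that is never observed gets a hard 0 probability from raw counts, and how adding a small pseudo-count per state keeps every state possible:

```python
# Hypothetical sketch: estimating P(state) from counts, with and without virtual samples.
from collections import Counter

def estimate_distribution(observations, states, pseudo_count=0.0):
    """Return P(state) for each state, optionally smoothed with virtual samples."""
    counts = Counter(observations)
    total = len(observations) + pseudo_count * len(states)
    return {s: (counts[s] + pseudo_count) / total for s in states}

data = ["A", "A", "B", "A", "B"]   # state "C" is never observed
states = ["A", "B", "C"]

print(estimate_distribution(data, states))
# {'A': 0.6, 'B': 0.4, 'C': 0.0}  -> a false deterministic relationship
print(estimate_distribution(data, states, pseudo_count=1))
# {'A': 0.5, 'B': 0.375, 'C': 0.125}  -> "everything is possible"
```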
sgfrazie
Dan, if I understand you correctly: due to a posterior distribution that is uniform, BayesiaLab would build the Occurrence Matrix (part of the Confusion Matrix associated with Analysis -> Network Performance) with all values equally distributed, i.e. essentially no prediction capability? In other words, I would get the red values shown in the occurrence matrix below?
[Attachment: Confusion.jpg]
Dan
In order to build the occurrence matrix, BayesiaLab needs to impute a target state value for each of the 999 sets of observations described in your dataset. The imputation is based on the posterior distribution of the target, i.e. the distribution once all the evidence described in a row has been set. By default, BayesiaLab selects the state with the highest posterior probability. When the posterior distribution is uniform (e.g. 1/3, 1/3, 1/3 in your example, or 0, 0, 0 when the set of evidence is not compatible with the joint probability distribution represented by the Bayesian network), BayesiaLab cannot use this rule anymore. Therefore, you are prompted to either manually select a state or let it randomly choose one.
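To make the selection rule concrete, here is a minimal Python sketch (again, not BayesiaLab's actual code; the names are made up). It applies the default highest-probability rule and falls back to a random choice when the posterior is uniform or all zeros:

```python
# Hypothetical sketch: imputing a target state from a posterior distribution.
import random

def impute_state(posterior):
    """posterior: dict mapping each target state to its probability."""
    probs = list(posterior.values())
    if max(probs) == 0 or len(set(probs)) == 1:
        # Uniform (e.g. 1/3, 1/3, 1/3) or all-zero posterior: no state stands out,
        # so fall back to a random choice among the states.
        return random.choice(list(posterior))
    # Default rule: the state with the highest posterior probability.
    return max(posterior, key=posterior.get)

print(impute_state({"Low": 0.1, "Medium": 0.7, "High": 0.2}))  # -> "Medium"
print(impute_state({"Low": 0.0, "Medium": 0.0, "High": 0.0}))  # -> a random state
```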

Note that in your modified screenshot, you are assuming that your model always returns a uniform distribution! If that were the case, it would be a totally useless model.

Getting a truly uniform posterior distribution is not very likely. However, getting an incompatible evidence set that returns a uniform distribution with a probability of 0 for all the states is quite common when the learning set is small. Using Smooth Probability Estimation prevents these false deterministic relationships (inferred from too few observations).