Averaged one-dependence estimators

Algorithm

The averaged one-dependence estimators or AODE algorithm works in the same way as a Naive Bayesian classifier, but it allows for mutual dependence between pairs of values within the input vector while continuing to ignore more complicated dependency relationships involving three or more values. In practice, AODE builds one estimator per input variable, letting every other variable depend on the class and on that single ‘parent’ variable, and then averages the estimators’ predictions. The algorithm thus somewhat relaxes the ‘naivety’ of the Naive Bayesian classifier.
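
As an illustration of the averaging just described, the sketch below implements a minimal AODE for categorical inputs: each input variable in turn acts as the ‘parent’, one estimator is built per parent by conditioning every other variable on the class and that parent's value, and the final score averages the estimators. The function names, the Laplace smoothing constant and the n_values parameter are illustrative assumptions, not part of the original description.

```python
from collections import Counter

def train_aode(X, y):
    """Collect the counts AODE needs: (class, value) pairs and
    (class, parent value, child value) triples over the training data."""
    pair_counts = Counter()    # key: (class, attribute index, value)
    triple_counts = Counter()  # key: (class, parent index, parent value, child index, child value)
    for xs, c in zip(X, y):
        for i, xi in enumerate(xs):
            pair_counts[(c, i, xi)] += 1
            for j, xj in enumerate(xs):
                if j != i:
                    triple_counts[(c, i, xi, j, xj)] += 1
    return len(y), pair_counts, triple_counts

def predict_aode(model, xs, classes, n_values, alpha=1.0):
    """Score each class by averaging one-dependence estimators, one per parent."""
    n, pair_counts, triple_counts = model
    scores = {}
    for c in classes:
        total = 0.0
        for i, xi in enumerate(xs):
            # joint probability of the class and the parent value, Laplace-smoothed
            p = (pair_counts[(c, i, xi)] + alpha) / (n + alpha * len(classes) * n_values[i])
            for j, xj in enumerate(xs):
                if j == i:
                    continue
                # every other value depends only on the class and the parent value
                p *= (triple_counts[(c, i, xi, j, xj)] + alpha) / (
                    pair_counts[(c, i, xi)] + alpha * n_values[j])
            total += p
        scores[c] = total / len(xs)  # average over the one-dependence estimators
    return max(scores, key=scores.get)

# Toy usage: two categorical attributes, each with two possible values.
X = [("sunny", "hot"), ("rainy", "cool"), ("sunny", "cool"), ("rainy", "hot")]
y = ["no", "yes", "yes", "no"]
model = train_aode(X, y)
print(predict_aode(model, ("sunny", "cool"), classes=["yes", "no"], n_values=[2, 2]))
```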

AODE performs well with a large number of (training or input) data items. However, because all pairs of input values must be counted, the required time and memory grow quadratically with the number of input values, so the algorithm is not feasible for high-dimensional vectors (data items with many input values). Where there are many input values, it can make sense to use dependence estimators only in those cases where interdependence is proven or at least suspected.
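
A back-of-the-envelope calculation makes the quadratic growth concrete: the table of pairwise counts that AODE stores has one entry per combination of class, parent variable, parent value, child variable and child value. The function below is only an illustration, and the example sizes are assumptions.

```python
def aode_table_size(n_classes, n_attributes, values_per_attribute):
    """Rough number of (class, parent, parent value, child, child value)
    entries AODE has to store and consult."""
    return n_classes * n_attributes * (n_attributes - 1) * values_per_attribute ** 2

print(aode_table_size(2, 10, 5))     # 4,500 entries: easily manageable
print(aode_table_size(2, 1000, 5))   # 49,950,000 entries: impractical
```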

In the weightily averaged one-dependence estimators or WAODE algorithm, the contribution of each input variable to the model is weighted according to how well that variable predicts the class on its own. WAODE has been found to yield significantly better results than simple AODE.
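
A common choice for these weights, and the one used in Jiang and Zhang's formulation of WAODE, is the mutual information between each input variable and the class. The sketch below computes such weights for the same categorical representation as above; the helper name and the use of natural logarithms are assumptions. In the prediction step, each parent's term would then be multiplied by its weight, and the total divided by the sum of the weights rather than by the number of variables.

```python
import math
from collections import Counter

def attribute_class_weights(X, y):
    """Mutual information between each input variable and the class,
    used as the per-parent weight in WAODE."""
    n = len(y)
    class_counts = Counter(y)
    weights = []
    for i in range(len(X[0])):
        value_counts = Counter(xs[i] for xs in X)
        joint_counts = Counter((xs[i], c) for xs, c in zip(X, y))
        mi = 0.0
        for (v, c), n_vc in joint_counts.items():
            p_vc = n_vc / n
            mi += p_vc * math.log(p_vc / ((value_counts[v] / n) * (class_counts[c] / n)))
        weights.append(mi)
    return weights
```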

alias
AODE
subtype
Weightily averaged one-dependence estimators (WAODE)
has functional building block
FBB_Classification
has input data type
IDT_Vector of categorical variables
IDT_Vector of quantitative variables
has internal model
INM_Probability
has output data type
ODT_Classification
has learning style
LST_Supervised
has parametricity
PRM_Parametric
has relevance
REL_Relevant
uses
sometimes supports
mathematically similar to