Averaged One-Dependence Estimators

Algorithm

The averaged one-dependence estimators (AODE) algorithm is an extension of the Naive Bayesian classifier that allows for mutual dependence between pairs of attributes within the input vector: each attribute may depend on the class and on one other attribute.
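
Concretely, AODE averages the predictions of one one-dependence model per attribute, each treating that attribute as a 'super-parent' on which all other attributes depend. In the formulation usually given in the AODE literature (here F(x_i) is the training-set frequency of value x_i and m is a minimum-frequency threshold), the class posterior is estimated as

    \hat{P}(y \mid x_1, \dots, x_d) \propto \sum_{i :\, F(x_i) \ge m} \hat{P}(y, x_i) \prod_{j=1}^{d} \hat{P}(x_j \mid y, x_i)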

It works similarly to a Naive Bayesian classifier but relaxes the ‘naivety’ by modelling dependencies between pairs of input attributes while ignoring more complex dependencies involving three or more attributes. This makes AODE more flexible and potentially more accurate than the Naive Bayesian classifier.
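
As a concrete illustration, here is a minimal sketch of AODE for categorical features in Python. The class name, the use of Laplace smoothing, and the default m = 1 are illustrative choices for this sketch, not taken from any particular library:

    import numpy as np
    from collections import defaultdict

    class AODESketch:
        """Minimal AODE for categorical data (illustrative, not optimized)."""

        def __init__(self, m=1):
            self.m = m  # minimum frequency for a value to act as super-parent

        def fit(self, X, y):
            X, y = np.asarray(X), np.asarray(y)
            self.n_, self.d_ = X.shape
            self.classes_ = list(np.unique(y))
            self.n_values_ = [len(np.unique(X[:, j])) for j in range(self.d_)]
            self.joint_ = defaultdict(int)  # counts of (class, i, x_i)
            self.pair_ = defaultdict(int)   # counts of (class, i, x_i, j, x_j)
            for row, c in zip(X, y):
                for i in range(self.d_):
                    self.joint_[(c, i, row[i])] += 1
                    for j in range(self.d_):
                        self.pair_[(c, i, row[i], j, row[j])] += 1
            return self

        def predict_one(self, x):
            scores = {}
            for c in self.classes_:
                total = 0.0
                for i in range(self.d_):
                    # frequency of value x_i across all classes
                    freq = sum(self.joint_[(cc, i, x[i])] for cc in self.classes_)
                    if freq < self.m:
                        continue  # too rare to serve as a reliable super-parent
                    nci = self.joint_[(c, i, x[i])]
                    # Laplace-smoothed estimate of P(c, x_i)
                    p = (nci + 1.0) / (self.n_ + len(self.classes_) * self.n_values_[i])
                    # product over all attributes of P(x_j | c, x_i)
                    for j in range(self.d_):
                        p *= (self.pair_[(c, i, x[i], j, x[j])] + 1.0) / (nci + self.n_values_[j])
                    total += p
                scores[c] = total
            return max(scores, key=scores.get)

For example, on a toy weather data set:

    X = [["sunny", "hot"], ["sunny", "mild"], ["rain", "mild"], ["rain", "hot"]]
    y = ["no", "yes", "yes", "no"]
    model = AODESketch(m=1).fit(X, y)
    print(model.predict_one(["sunny", "mild"]))  # -> "yes" on this toy data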

AODE scales well with the number of training examples, since training requires only a single pass over the data to collect frequency counts. However, because all pairs of input attributes are considered in a combinatorial fashion, memory use and prediction time grow quadratically with the number of attributes, so it is not feasible to use the AODE algorithm with high-dimensional vectors where each data item has many input values. In such cases, it can be beneficial to use dependence estimators only where interdependence is proven or at least suspected.

The weightily averaged one-dependence estimators (WAODE) algorithm is a variant of AODE in which the contribution of each one-dependence estimator is weighted according to how well its super-parent attribute classifies the data on its own. WAODE has been found to yield significantly better results than plain AODE.
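
A minimal sketch of one such weighting, assuming the mutual information between each attribute and the class is used as the per-attribute weight (the function name and this particular weighting choice are illustrative assumptions for this sketch):

    import numpy as np
    from collections import Counter

    def mi_weights(X, y):
        """Weight for each attribute: mutual information I(X_i; Y) between
        that attribute and the class label (one plausible WAODE-style choice)."""
        X, y = np.asarray(X), np.asarray(y)
        n, d = X.shape
        class_counts = Counter(y)
        weights = np.zeros(d)
        for i in range(d):
            value_counts = Counter(X[:, i])
            joint_counts = Counter(zip(X[:, i], y))
            mi = 0.0
            for (v, c), nvc in joint_counts.items():
                p_vc = nvc / n
                mi += p_vc * np.log(p_vc / ((value_counts[v] / n) * (class_counts[c] / n)))
            weights[i] = mi
        return weights

In the AODE sum shown earlier, the estimator anchored on attribute i would then be multiplied by weights[i] before the terms are added.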

AODE and WAODE are important because they provide a balance between the simplicity of the Naive Bayesian classifier and the complexity of models that consider all possible dependencies. They are used in various classification tasks where it is important to account for dependencies between input variables without introducing excessive computational complexity.

Alias
AODE
Related terms
Naive Bayesian Classifier
Classification
Supervised Learning