Learning Vector Quantization (LVQ) is a supervised learning algorithm used to classify data by finding representative vectors for each class.
LVQ is used when data points must be assigned to distinct categories on the basis of pre-classified (labelled) training data. The labelled data is used to train the model, which learns the patterns that distinguish the classes; once trained, the model can classify new, unlabelled data points. It is particularly useful when a compact reference set is desired for classification, for example as a reduced codebook for a nearest neighbour classifier.
The algorithm operates in a multidimensional space containing pre-classified training data. Movable vectors are placed in this space, distributed among the classes present in the training data. These vectors can be placed randomly or initialised using prior knowledge, for example by copying labelled training points. During training, a data item is chosen at random, and its class is compared with the class of the closest movable vector. If the classes match, the vector is moved towards the data item; if they do not, the vector is moved away. The goal is for the vectors to end up at the centres of their respective classes.
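This update rule can be sketched in a few lines of Python. The following is a minimal illustration of the classic LVQ1 variant; the function names, the decaying learning-rate schedule, and the synthetic two-cluster data are illustrative assumptions, not part of any particular library's API.

```python
import numpy as np

def train_lvq1(X, y, n_prototypes_per_class=1, learning_rate=0.1, n_epochs=50, seed=0):
    """Minimal LVQ1 sketch: prototypes are moved toward or away from randomly
    chosen training points depending on whether the classes agree."""
    rng = np.random.default_rng(seed)
    classes = np.unique(y)

    # Initialise prototypes by copying randomly chosen training points of each class.
    prototypes, proto_labels = [], []
    for c in classes:
        idx = rng.choice(np.flatnonzero(y == c), size=n_prototypes_per_class, replace=False)
        prototypes.append(X[idx])
        proto_labels.extend([c] * n_prototypes_per_class)
    prototypes = np.vstack(prototypes).astype(float)
    proto_labels = np.array(proto_labels)

    for epoch in range(n_epochs):
        lr = learning_rate * (1 - epoch / n_epochs)  # shrink the step size over time
        for i in rng.permutation(len(X)):
            # Find the closest prototype to the sampled training point.
            distances = np.linalg.norm(prototypes - X[i], axis=1)
            winner = np.argmin(distances)
            # Move it toward the point if the classes match, away otherwise.
            direction = 1.0 if proto_labels[winner] == y[i] else -1.0
            prototypes[winner] += direction * lr * (X[i] - prototypes[winner])
    return prototypes, proto_labels

def predict(prototypes, proto_labels, X_new):
    """Classify new points by the label of their nearest prototype."""
    distances = np.linalg.norm(X_new[:, None, :] - prototypes[None, :, :], axis=2)
    return proto_labels[np.argmin(distances, axis=1)]

# Toy usage on synthetic 2-D data: two well-separated clusters.
X = np.vstack([np.random.randn(50, 2) + [0, 0], np.random.randn(50, 2) + [4, 4]])
y = np.array([0] * 50 + [1] * 50)
protos, labels = train_lvq1(X, y)
print(predict(protos, labels, np.array([[0.5, 0.2], [3.8, 4.1]])))  # expected: [0 1]
```

After training, the prototypes act as a small codebook: a new point is simply assigned the class of its nearest prototype, which is what makes LVQ useful as a compact stand-in for a full nearest neighbour reference set.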
For example, if you have a dataset with different types of flowers, LVQ adjusts the positions of the vectors during training so that each vector best represents one flower type; a new flower is then assigned the class of its nearest vector.
The main advantage of LVQ is its simplicity and the intuitive nature of its vector adjustments. However, it may not perform well if the classes are not clearly separated in the vector space, and other algorithms might be more effective in such cases.
LVQ is related to concepts like “Nearest Neighbour” and “Supervised Learning”.
- Alias
  - LVQ
- Related terms
  - Nearest Neighbour
  - Supervised Learning