The purpose of this thesis has been to test a Support Vector Machine (SVM) based classifier on hyperspectral data using quadratic optimization theory. This has been done by solving the C-SVM quadratic program with the central path method.
The thesis compares the performance of the central path method with other commonly known methods, such as MATLAB's quadprog function, the on-line algorithm, and a simple method called the center algorithm. All of them solve the C-SVM problem or closely related problems.
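The C-SVM dual is a quadratic program in the multipliers α, subject to box constraints 0 ≤ α_i ≤ C and the equality constraint Σ α_i y_i = 0. As a minimal sketch of what any of the compared solvers must compute, the snippet below solves this dual for a tiny, hypothetical 2-D data set with SciPy's general-purpose constrained optimizer (not the central path method of the thesis); the data and the value of C are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical, linearly separable toy data (illustration only).
X = np.array([[2.0, 2.0], [3.0, 3.0], [2.5, 3.5],
              [0.0, 0.0], [1.0, 0.5], [0.5, 1.5]])
y = np.array([1.0, 1.0, 1.0, -1.0, -1.0, -1.0])
C = 10.0                                  # loss-tolerance parameter of C-SVM

K = X @ X.T                               # linear kernel (Gram) matrix
Q = np.outer(y, y) * K

# Dual C-SVM objective: minimize 1/2 a^T Q a - sum(a)
def objective(a):
    return 0.5 * a @ Q @ a - a.sum()

res = minimize(objective, np.zeros(len(y)),
               bounds=[(0.0, C)] * len(y),                    # 0 <= alpha_i <= C
               constraints={"type": "eq", "fun": lambda a: a @ y})  # sum alpha_i y_i = 0
alpha = res.x

# Recover the primal discriminant from the support vectors
w = (alpha * y) @ X
sv = (alpha > 1e-6) & (alpha < C - 1e-6)  # unbounded support vectors
b = np.mean(y[sv] - X[sv] @ w)

print("predictions:", np.sign(X @ w + b))
```

Dedicated solvers such as the central path method exploit the special structure of this QP (a single equality constraint plus simple bounds) instead of treating it as a generic nonlinear program.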
By also taking advantage of the power of kernel functions, C-SVM becomes a powerful tool for classifying hyperspectral data. The kernel functions implicitly map the feature space into a higher-dimensional space where the data can be separated by a linear discriminant function. By introducing a loss tolerance C, the classifier also generalizes to cases that are not linearly separable.
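The effect of the kernel mapping can be seen on the classic XOR pattern, which no linear discriminant in the original feature space can separate. The sketch below replaces the linear kernel in the C-SVM dual with a Gaussian (RBF) kernel; the data set, the kernel width gamma, and the value of C are illustrative assumptions, and SciPy's generic constrained optimizer stands in for the thesis's central path solver.

```python
import numpy as np
from scipy.optimize import minimize

# XOR-style points: not separable by any linear discriminant in the
# original feature space (hypothetical data for illustration).
X = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
C, gamma = 10.0, 1.0

# RBF kernel: implicit mapping into a higher-dimensional space
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-gamma * sq)
Q = np.outer(y, y) * K

res = minimize(lambda a: 0.5 * a @ Q @ a - a.sum(),
               np.zeros(len(y)),
               bounds=[(0.0, C)] * len(y),
               constraints={"type": "eq", "fun": lambda a: a @ y})
alpha = res.x

# Bias from the unbounded support vectors, prediction via the kernel only
sv = (alpha > 1e-6) & (alpha < C - 1e-6)
b = np.mean([y[i] - (alpha * y) @ K[:, i] for i in np.where(sv)[0]])
pred = np.sign((alpha * y) @ K + b)
print(pred)
```

Note that the discriminant is evaluated entirely through kernel values, so the higher-dimensional space is never constructed explicitly; this is what makes kernels practical for high-dimensional hyperspectral pixels.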
For classifying hyperspectral data this is a promising method, because it generalizes well from the training and validation sets to an unseen test set.