

The original support vector machine (SVM), introduced in 1992 [2, 19, 43], can be characterized as a supervised learning algorithm capable of solving linear and nonlinear classification problems. In comparison to neural networks, an SVM may be described as a feedforward neural net with one hidden layer (Fig. 6).

Fig. 6: Architecture of an SVM classifier with a linear or nonlinear kernel function [6].

The main idea of support vector classification is to separate the examples with a linear decision surface and to maximize the margin between the different classes. This leads to a convex quadratic programming problem (the primal form is omitted for brevity; see for example [8]). The Lagrange multiplier α_i measures the influence of the i-th learning example on the objective function W. Examples for which α_i is positive are called support vectors, as they define the separating hyperplane. C is a constant cost parameter controlling the number of support vectors; it enables the user to control the trade-off between the learning error and the model complexity, reflected in the margin of the separating hyperplane [41]. As complexity is considered directly during the learning stage, the risk of overfitting the training data is less severe for SVMs.
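As an illustration of these ideas, the following sketch trains a linear soft-margin SVM and inspects its support vectors. The library (scikit-learn), the toy data set and the value of C are assumptions for demonstration only; they are not part of the original text.

```python
# Sketch (illustrative assumptions): a linear soft-margin SVM in
# scikit-learn; the data set and C value are chosen for demonstration.
import numpy as np
from sklearn.svm import SVC

X = np.array([[0.0, 0.0], [0.5, 0.5], [1.0, 1.0],
              [3.0, 3.0], [3.5, 3.5], [4.0, 4.0]])
y = np.array([0, 0, 0, 1, 1, 1])

# C trades off learning error against model complexity: a small C widens
# the margin and tolerates more violations, a large C penalizes errors
# heavily and tends to shrink the margin.
clf = SVC(kernel="linear", C=1.0)
clf.fit(X, y)

# Only examples with positive Lagrange multipliers alpha_i become
# support vectors; they alone define the separating hyperplane.
print(clf.support_vectors_)
print(clf.dual_coef_)  # y_i * alpha_i for each support vector
```

Varying C and re-fitting shows how the set of support vectors, and with it the margin, changes.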
The separation rule is given by an indicator function using the dot product between the pattern to be classified x, the support vectors and a constant threshold b.

Fig. 7: Nonlinear mapping from a two-dimensional input space with nonlinear class boundaries into a three-dimensional feature space with linear separation by a hyperplane.

Compared to neural networks, the SVM method offers a significantly smaller number of parameters. The main modelling freedom consists in the choice of a kernel function and the corresponding kernel parameters, which influence the speed of convergence and the quality of the results. Furthermore, the choice of the cost parameter C is vital to obtain good classification results, although algorithmic modifications can further simplify this task [16].
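The effect of the kernel choice can be sketched on XOR-like data, which no linear surface separates in the input space but which a nonlinear kernel separates after the implicit feature-space mapping. The library (scikit-learn), the data and the gamma value are illustrative assumptions, not part of the original text.

```python
# Sketch (illustrative assumptions): comparing kernel functions in
# scikit-learn on data that is not linearly separable in input space.
import numpy as np
from sklearn.svm import SVC

rng = np.random.RandomState(0)
X = rng.randn(200, 2)
# XOR-like labels: positive in two opposite quadrants.
y = np.logical_xor(X[:, 0] > 0, X[:, 1] > 0).astype(int)

for kernel, params in [("linear", {}), ("rbf", {"gamma": 1.0})]:
    clf = SVC(kernel=kernel, C=1.0, **params)
    clf.fit(X, y)
    # The decision function is a kernel expansion over the support
    # vectors plus the constant threshold b (clf.intercept_).
    print(kernel, clf.score(X, y))
```

The linear kernel stays near chance level on this data, while the RBF kernel fits it well, illustrating why the kernel and its parameters are the main modelling decision.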
© 2002-2005 BI³Slab, Hamburg, Germany. All rights reserved.
