Title: Quadratic kernel-free non-linear support vector machine
Authors: Dagher, Issam 
Affiliations: Department of Computer Engineering 
Keywords: Support Vector Machine (SVM)
Geometrical margin
QSVM
Quadratic function
Dual optimization form
Kernel trick
Issue Date: 2008
Part of: Journal of Global Optimization
Volume: 41
Issue: 1
Start page: 15
End page: 30
Abstract: 
A new quadratic kernel-free non-linear support vector machine (called QSVM) is introduced. The SVM optimization problem can be stated as follows: maximize the geometrical margin subject to all training data having a functional margin greater than a constant. The functional margin is equal to W^T X + b, which is the equation of the hyper-plane used for linear separation; the geometrical margin is equal to 1/||W||; and the constant in this case is equal to one. To separate the data non-linearly, a dual optimization form and the kernel trick must be used. In this paper, a quadratic decision function capable of separating the data non-linearly is used. The geometrical margin is proved to be equal to the inverse of the norm of the gradient of the decision function, and the functional margin is given by the quadratic function itself. QSVM is proved to be expressible as a quadratic optimization problem, a setting that requires neither a dual form nor the kernel trick. Comparisons between QSVM and the SVM with Gaussian and polynomial kernels on databases from the UCI repository are presented.
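The margin statements in the abstract can be summarized as follows (a sketch based only on what the abstract states; the quadratic coefficients A, c, d and labels y_i are illustrative notation, not necessarily the paper's own):

\[ \text{Linear SVM:}\quad f(x) = W^{T}x + b, \qquad \text{geometrical margin} = \frac{1}{\lVert W \rVert} \]
\[ \text{QSVM:}\quad f(x) = x^{T} A\, x + c^{T} x + d, \qquad \nabla f(x) = (A + A^{T})\, x + c, \qquad \text{geometrical margin} = \frac{1}{\lVert \nabla f(x) \rVert} \]
\[ \text{Training problem:}\quad \max\ \text{margin} \quad \text{subject to} \quad y_{i}\, f(x_{i}) \ge 1 \ \text{for all training points } (x_{i}, y_{i}) \]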
URI: https://scholarhub.balamand.edu.lb/handle/uob/2460
Type: Journal Article
Appears in Collections: Department of Computer Engineering
