TY - GEN
T1 - L1-Norm Support Vector Regression in Primal Based on Huber Loss Function
AU - Puthiyottil, Anagha
AU - Balasundaram, S.
AU - Meena, Yogendra
N1 - Publisher Copyright:
© 2020, Springer Nature Switzerland AG.
PY - 2020
Y1 - 2020
N2 - Support vector regression (SVR) has become a state-of-the-art machine learning method for regression owing to its excellent generalization performance on many real-world problems. It is well known that standard SVR determines the regressor using a predefined epsilon tube around the data points: points lying outside the tube contribute to the error, whereas points inside it are simply ignored. To measure this data misfit, the epsilon-insensitive function is introduced as the loss function. Compared with the popular quadratic loss function, it is robust but only continuous, not differentiable, which makes numerical minimization difficult. The Huber function, which treats large errors linearly (robustly) and small errors quadratically, has been used in the literature to measure the data misfit, with the advantage of being smooth, i.e., differentiable everywhere. In this study, we propose a novel robust Huber SVR (HSVR) formulation in the primal, in which the regressor is made as flat as possible by taking the regularization term in the L1-norm. Since this regularization term is non-smooth, it is replaced by smooth approximation functions, yielding a new problem formulation that is then solved by a functional iterative method. Experiments on several synthetic and real-world data sets confirm the suitability and effectiveness of the proposed robust model.
AB - Support vector regression (SVR) has become a state-of-the-art machine learning method for regression owing to its excellent generalization performance on many real-world problems. It is well known that standard SVR determines the regressor using a predefined epsilon tube around the data points: points lying outside the tube contribute to the error, whereas points inside it are simply ignored. To measure this data misfit, the epsilon-insensitive function is introduced as the loss function. Compared with the popular quadratic loss function, it is robust but only continuous, not differentiable, which makes numerical minimization difficult. The Huber function, which treats large errors linearly (robustly) and small errors quadratically, has been used in the literature to measure the data misfit, with the advantage of being smooth, i.e., differentiable everywhere. In this study, we propose a novel robust Huber SVR (HSVR) formulation in the primal, in which the regressor is made as flat as possible by taking the regularization term in the L1-norm. Since this regularization term is non-smooth, it is replaced by smooth approximation functions, yielding a new problem formulation that is then solved by a functional iterative method. Experiments on several synthetic and real-world data sets confirm the suitability and effectiveness of the proposed robust model.
UR - https://www.scopus.com/pages/publications/85075277830
U2 - 10.1007/978-3-030-30577-2_16
DO - 10.1007/978-3-030-30577-2_16
M3 - Conference contribution
AN - SCOPUS:85075277830
SN - 9783030305765
T3 - Lecture Notes in Electrical Engineering
SP - 195
EP - 205
BT - Proceedings of ICETIT 2019 - Emerging Trends in Information Technology
A2 - Singh, Pradeep Kumar
A2 - Panigrahi, Bijaya Ketan
A2 - Suryadevara, Nagender Kumar
A2 - Sharma, Sudhir Kumar
A2 - Singh, Amit Prakash
PB - Springer
T2 - 1st International Conference on Emerging Trends in Information Technology, ICETIT 2019
Y2 - 21 June 2019 through 22 June 2019
ER -