
Survey of the Selection of Kernels and Hyper-parameters in Support Vector Regression

XIAO Jian, YU Long, BAI Yifeng

Citation: XIAO Jian, YU Long, BAI Yifeng. Survey of the Selection of Kernels and Hyper-parameters in Support Vector Regression[J]. Journal of Southwest Jiaotong University, 2008, 21(3): 297-303.


Funding:

National Natural Science Foundation of China (60674057)

Doctoral Program Foundation (20060613003)

Details
    About the author:

    XIAO Jian (1950- ), male, professor and Ph.D. supervisor; research interests: machine learning, computer control, and robust control. E-mail: jxiao@nec.swjtu.edu.cn


  • Abstract: The model structure of support vector regression (SVR) is critical for reducing both the empirical risk and the confidence interval, and the accuracy and generalization ability of SVR depend to a large extent on the kernel function and the hyper-parameters. To analyze SVR model selection methods systematically and in depth, the existing representative methods are divided into two groups, kernel selection and hyper-parameter determination, and are surveyed and evaluated from different perspectives. Directions for future research are also proposed.
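For background, the quantities whose selection the survey addresses enter the standard ε-insensitive SVR training problem shown below. This is the textbook formulation (following Vapnik), reproduced here as context rather than taken from the paper itself:

```latex
\begin{aligned}
\min_{\mathbf{w},\, b,\, \boldsymbol{\xi},\, \boldsymbol{\xi}^{*}} \quad
  & \frac{1}{2}\lVert \mathbf{w} \rVert^{2}
    + C \sum_{i=1}^{n} \left( \xi_{i} + \xi_{i}^{*} \right) \\
\text{s.t.} \quad
  & y_{i} - \langle \mathbf{w}, \varphi(\mathbf{x}_{i}) \rangle - b \le \varepsilon + \xi_{i}, \\
  & \langle \mathbf{w}, \varphi(\mathbf{x}_{i}) \rangle + b - y_{i} \le \varepsilon + \xi_{i}^{*}, \\
  & \xi_{i},\ \xi_{i}^{*} \ge 0, \quad i = 1, \dots, n,
\end{aligned}
```

where the kernel k(x, x') = ⟨φ(x), φ(x')⟩ fixes the model structure, and the regularization constant C, the tube width ε, and any kernel parameters (e.g. the Gaussian width) are the hyper-parameters whose determination the survey reviews.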

     

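As a concrete illustration of the hyper-parameter-determination side of model selection, the sketch below runs a cross-validation grid search over candidate kernels and the SVR hyper-parameters C, ε and the kernel scale. It uses scikit-learn, and the toy data set and parameter grids are assumptions made purely for illustration; they are not taken from the paper.

```python
# Minimal sketch: cross-validation-based selection of the kernel and
# hyper-parameters for support vector regression (illustrative only).
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

# Toy 1-D regression data: a noisy sinc function (assumed example).
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sinc(X).ravel() + 0.1 * rng.standard_normal(200)

# Candidate kernels and hyper-parameter grid (illustrative values).
param_grid = {
    "kernel": ["rbf", "poly", "sigmoid"],   # kernel selection
    "C": [0.1, 1.0, 10.0, 100.0],           # regularization constant
    "epsilon": [0.01, 0.1, 0.5],            # width of the insensitive tube
    "gamma": ["scale", 0.1, 1.0],           # kernel width / scale parameter
}

# 5-fold cross-validation over the full grid, scored by mean squared error.
search = GridSearchCV(SVR(), param_grid, cv=5,
                      scoring="neg_mean_squared_error")
search.fit(X, y)

print("selected parameters:", search.best_params_)
print("cross-validated MSE:", -search.best_score_)
```

Exhaustive grid search scales poorly as the number of hyper-parameters grows, which is one motivation for the more principled selection methods the survey reviews.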
Publication History
  • Received: 2008-03-03
  • Published: 2008-06-25
