Block-Wise Kernel Partial Least-Squares Method
Abstract: A block-wise kernel partial least-squares (BKPLS) method is proposed to address the computational burden of kernel partial least-squares (KPLS), which grows as the dimension of the kernel matrix expands. Exploiting the symmetry of the kernel matrix, BKPLS converts the batch-wise computation in KPLS into a block-wise computation, which both relaxes the demands on computer hardware and reduces computation time. Simulation results verify the effectiveness of BKPLS: it shortens computation time, and it can still carry out the identification algorithm on sample sets so large that KPLS cannot be implemented.
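The block-wise idea described in the abstract can be illustrated with a minimal Python sketch. This is an illustration under stated assumptions, not the authors' exact BKPLS algorithm: the kernel matrix K(X, X) is assembled block by block, and because K is symmetric only the blocks on or above the diagonal are evaluated explicitly, with the lower-triangular blocks filled by transposition. The RBF kernel, the helper names rbf_kernel and block_kernel_matrix, and the block size are illustrative choices.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel between row-sample matrices A and B."""
    sq_dist = (
        np.sum(A**2, axis=1)[:, None]
        + np.sum(B**2, axis=1)[None, :]
        - 2.0 * A @ B.T
    )
    return np.exp(-gamma * sq_dist)

def block_kernel_matrix(X, block=256, gamma=1.0):
    """Assemble K(X, X) block by block.

    Because K is symmetric, only blocks on or above the diagonal are
    evaluated; lower-triangular blocks are filled by transposing, which
    roughly halves the kernel evaluations relative to a naive batch
    computation and keeps the per-step working set at block size.
    """
    n = X.shape[0]
    K = np.empty((n, n))
    for i in range(0, n, block):
        Xi = X[i:i + block]
        for j in range(i, n, block):
            Xj = X[j:j + block]
            Kij = rbf_kernel(Xi, Xj, gamma)
            K[i:i + block, j:j + block] = Kij
            if j > i:
                K[j:j + block, i:i + block] = Kij.T
    return K

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((1000, 5))
    K_block = block_kernel_matrix(X, block=200, gamma=0.5)
    K_batch = rbf_kernel(X, X, gamma=0.5)
    print(np.allclose(K_block, K_batch))  # same matrix, fewer evaluations
```

In this sketch the peak memory of each kernel evaluation is bounded by the block size rather than the full sample count, which is the property that lets a block-wise scheme scale to sample sizes where a single batch evaluation of the kernel matrix fails.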
Key words:
- KPLS
- kernel matrix
- dimensional explosion
- BKPLS
- calculation time