In a high-dimensional feature space, the decision hyperplane of a learning machine of the support vector machine type tends to pass through the origin, so no bias term is needed. The ν-support vector regression machine (ν-SVR), however, does include a bias. To study the role of the bias in ν-SVR, an unbiased ν-SVR optimization problem is formulated and a method for solving it is given. Experiments on standard data sets show that the generalization performance of the unbiased ν-SVR is better than that of ν-SVR. Based on an analysis of the solution space of the dual optimization problem, the bias should not be included in the ν-SVR optimization problem, and the ν-SVR decision hyperplane should pass through the origin in the high-dimensional feature space.
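For context, the standard ν-SVR primal problem (in the form of Schölkopf et al.) is recalled below, together with a sketch of the unbiased variant; the unbiased formulation here is only an assumption obtained by dropping the bias term b, and the exact problem solved in the paper may differ in detail.

\[
\min_{w,\,b,\,\varepsilon,\,\xi,\,\xi^{*}}\ \frac{1}{2}\|w\|^{2} + C\Big(\nu\varepsilon + \frac{1}{\ell}\sum_{i=1}^{\ell}\big(\xi_i + \xi_i^{*}\big)\Big)
\]
subject to
\[
\big(w^{\top}\phi(x_i) + b\big) - y_i \le \varepsilon + \xi_i,\qquad
y_i - \big(w^{\top}\phi(x_i) + b\big) \le \varepsilon + \xi_i^{*},\qquad
\xi_i,\ \xi_i^{*} \ge 0,\ \ \varepsilon \ge 0 .
\]

In the unbiased variant the term b is removed, so the regression function becomes f(x) = w^{\top}\phi(x). Correspondingly, the equality constraint \(\sum_i(\alpha_i - \alpha_i^{*}) = 0\), which enters the dual only through optimization over b, no longer appears; this change in the dual constraint set is presumably what the solution-space analysis mentioned above examines.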