Naive Bayes is a supervised learning method for classification. In theory, its application assumes that an instance's attribute values are conditionally independent given the instance's class. This assumption is too strict in practice and is often violated; even so, naive Bayes learning has achieved great success in settings where the assumption does not hold. Recently, an approach for improving naive Bayes, boosting, has received widespread attention, with AdaBoost as its principal algorithm. When AdaBoost is used to combine several naive Bayes classifiers, the result is mathematically equivalent to a feedback neural network with sparse-coded inputs, a single hidden layer of nodes, and sigmoid activation functions.
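The combination described above can be sketched in code. The following is a minimal illustration, not the paper's own implementation: it uses a weighted Bernoulli naive Bayes as the base learner inside two-class (discrete) AdaBoost, with Laplace smoothing and the standard exponential reweighting rule. All function names are illustrative.

```python
import numpy as np

def fit_bernoulli_nb(X, y, w):
    """Weighted Bernoulli naive Bayes for binary features and labels {0, 1}.
    Per-class feature probabilities are Laplace-smoothed and weighted by w."""
    params = {}
    for c in (0, 1):
        wc, Xc = w[y == c], X[y == c]
        log_prior = np.log(wc.sum() / w.sum())
        # weighted frequency of feature == 1 within class c (Laplace-smoothed)
        theta = (wc @ Xc + 1.0) / (wc.sum() + 2.0)
        params[c] = (log_prior, np.log(theta), np.log(1.0 - theta))
    return params

def predict_nb(params, X):
    """Class with the larger log-posterior under the independence assumption."""
    scores = []
    for c in (0, 1):
        log_prior, log_theta, log_not_theta = params[c]
        scores.append(log_prior + X @ log_theta + (1.0 - X) @ log_not_theta)
    return (scores[1] > scores[0]).astype(int)

def adaboost_nb(X, y, rounds=10):
    """Discrete AdaBoost over naive Bayes base learners.
    Returns a list of (alpha, base-learner parameters) pairs."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(rounds):
        params = fit_bernoulli_nb(X, y, w)
        pred = predict_nb(params, X)
        err = w[pred != y].sum()
        if err == 0.0:                 # perfect base learner: keep it and stop
            ensemble.append((1.0, params))
            break
        if err >= 0.5:                 # no better than chance: stop boosting
            break
        alpha = 0.5 * np.log((1.0 - err) / err)
        ensemble.append((alpha, params))
        # up-weight misclassified examples, down-weight correct ones
        w *= np.exp(alpha * np.where(pred != y, 1.0, -1.0))
        w /= w.sum()
    return ensemble

def predict_boosted(ensemble, X):
    """Alpha-weighted vote of base learners, mapped to labels {-1, +1}."""
    agg = sum(a * (2 * predict_nb(p, X) - 1) for a, p in ensemble)
    return (agg > 0).astype(int)
```

The `alpha` weights assigned to the base classifiers play the same role as the hidden-to-output weights in the equivalent network view mentioned above: the final decision is a weighted sum of the individual naive Bayes outputs passed through a threshold.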