This paper presents a new inductive learning algorithm, HGR (Version 2.0), based on the newly developed extension matrix theory. The basic idea is to partition the positive examples of a given class in an example set into consistent groups, where each group corresponds to a consistent rule that covers all the examples in the group and none of the negative examples. The paper then compares the performance of HGR with other inductive algorithms such as C4.5, OC1, HCV, and SVM. The authors not only selected 15 databases from the well-known UCI machine learning repository but also considered a real-world problem. Experimental results show that their method achieves higher accuracy and produces fewer rules than the other algorithms.
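To make the "consistent groups" idea concrete, the following is a minimal greedy sketch, not the authors' HGR algorithm: it assumes numeric attributes, represents a rule as an axis-aligned interval per attribute, and omits the extension matrix machinery entirely. The function and type names (consistent_groups, rule_from_group, etc.) are illustrative assumptions.

```python
from typing import List, Tuple

Example = Tuple[float, ...]          # one example = a tuple of attribute values
Rule = List[Tuple[float, float]]     # one (lo, hi) interval per attribute

def rule_from_group(group: List[Example]) -> Rule:
    """Smallest interval rule covering every example in the group."""
    return [(min(e[i] for e in group), max(e[i] for e in group))
            for i in range(len(group[0]))]

def covers(rule: Rule, example: Example) -> bool:
    return all(lo <= v <= hi for (lo, hi), v in zip(rule, example))

def consistent(rule: Rule, negatives: List[Example]) -> bool:
    """A rule is consistent if it covers none of the negative examples."""
    return not any(covers(rule, n) for n in negatives)

def consistent_groups(positives: List[Example],
                      negatives: List[Example]) -> List[Tuple[List[Example], Rule]]:
    """Greedily partition the positive examples into consistent groups.

    Each positive example joins the first existing group whose enlarged rule
    would still exclude all negatives; otherwise it starts a new group.
    """
    groups: List[List[Example]] = []
    for p in positives:
        placed = False
        for g in groups:
            candidate = rule_from_group(g + [p])
            if consistent(candidate, negatives):
                g.append(p)
                placed = True
                break
        if not placed:
            groups.append([p])
    return [(g, rule_from_group(g)) for g in groups]

if __name__ == "__main__":
    # Toy data: two clusters of positives separated by a negative example.
    pos = [(1.0, 1.0), (1.2, 0.9), (5.0, 5.0)]
    neg = [(3.0, 3.0)]
    for group, rule in consistent_groups(pos, neg):
        print(group, "->", rule)
```

Under these assumptions the sketch yields two groups for the toy data, each with a rule covering its own positives and excluding the negative; the actual HGR procedure builds and searches extension matrices to find such groups and rules far more effectively.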