Original Articles

Bayesian network classifiers for probability-based metrics

Pages 477–491 | Received 19 Jan 2012, Accepted 02 Dec 2012, Published online: 14 Jun 2013

References

  • Aha, D. (1992). Tolerating noisy, irrelevant and novel attributes in instance-based learning algorithms. International Journal of Man-Machine Studies, 36, 267–287.
  • Aha, D., Kibler, D., & Albert, M. K. (1991). Instance-based learning algorithms. Machine Learning, 6, 37–66.
  • Atkeson, C. G., Moore, A. W., & Schaal, S. (1997). Locally weighted learning. Artificial Intelligence Review, 11(1–5), 11–73.
  • Bennett, P. N. (2000). Assessing the calibration of naive Bayes' posterior estimates. Technical Report No. CMU-CS-00-155. Pittsburgh, PA: Carnegie Mellon University.
  • Blanzieri, E., & Ricci, F. (1999). Probability-based metrics for nearest neighbor classification and case-based reasoning. Lecture Notes in Computer Science, 1650, 14–28.
  • Chickering, D. M. (1996). Learning Bayesian networks is NP-complete. In D. Fisher & H. Lenz (Eds.), Learning from data: Artificial intelligence and statistics V (pp. 121–130). New York: Springer.
  • Chow, C. K., & Liu, C. N. (1968). Approximating discrete probability distributions with dependence trees. IEEE Transactions on Information Theory, 14, 462–467.
  • Cleary, J. G., & Trigg, L. E. (1995). K*: An instance-based learner using an entropic distance measure. Proceedings of the Twelfth International Conference on Machine Learning (pp. 108–114). Tahoe City, CA: Morgan Kaufmann.
  • Cost, S., & Salzberg, S. (1993). A weighted nearest neighbor algorithm for learning with symbolic features. Machine Learning, 10, 57–78.
  • Cover, T. M., & Hart, P. E. (1967). Nearest neighbor pattern classification. IEEE Transactions on Information Theory, 13(1), 21–27.
  • Domingos, P., & Pazzani, M. J. (1997). On the optimality of the simple Bayesian classifier under zero-one loss. Machine Learning, 29, 103–130.
  • Frank, E., Hall, M., & Pfahringer, B. (2003). Locally weighted naive Bayes. Proceedings of the Nineteenth Conference on Uncertainty in Artificial Intelligence (pp. 249–256). Acapulco, Mexico: Morgan Kaufmann.
  • Friedman, N., Geiger, D., & Goldszmidt, M. (1997). Bayesian network classifiers. Machine Learning, 29, 131–163.
  • Grossman, D., & Domingos, P. (2004). Learning Bayesian network classifiers by maximizing conditional likelihood. Proceedings of the Twenty-First International Conference on Machine Learning (pp. 361–368). Banff: ACM Press.
  • Guo, Y., & Greiner, R. (2005). Discriminative model selection for belief net structures. Proceedings of the Twentieth National Conference on Artificial Intelligence, AAAI2005 (pp. 770–776). Pittsburgh, Pennsylvania, USA: AAAI Press.
  • Jiang, L. (2011). Learning random forests for ranking. Frontiers of Computer Science in China, 5(1), 79–86.
  • Jiang, L. (2011). Random one-dependence estimators. Pattern Recognition Letters, 32(3), 532–539.
  • Jiang, L., Zhang, H., & Cai, Z. (2009). A novel Bayes model: Hidden naive Bayes. IEEE Transactions on Knowledge and Data Engineering, 21(10), 1361–1371.
  • Jiang, L., Zhang, H., Cai, Z., & Wang, D. (2012). Weighted average of one-dependence estimators. Journal of Experimental and Theoretical Artificial Intelligence, 24(2), 219–230.
  • Jiang, L., Cai, Z., Wang, D., & Zhang, H. (2012). Improving tree augmented naive Bayes for class probability estimation. Knowledge-Based Systems, 26(1), 239–245.
  • Jiang, L., Cai, Z., Zhang, H., & Wang, D. (2012). Not so greedy: Randomly selected naive Bayes. Expert Systems with Applications, 39(12), 11022–11028.
  • Kohavi, R. (1996). Scaling up the accuracy of naive-Bayes classifiers: A decision-tree hybrid. Proceedings of the Second International Conference on Knowledge Discovery and Data Mining (KDD-96) (pp. 202–207). Portland, Oregon, USA: AAAI Press.
  • Langley, P., & Sage, S. (1994). Induction of selective Bayesian classifiers. Proceedings of the Tenth Conference on Uncertainty in Artificial Intelligence (pp. 399–406). Seattle, Washington, USA.
  • Ling, C., & Yan, R. (2003). Decision tree with better ranking. Proceedings of the Twentieth International Conference on Machine Learning (pp. 480–487). Washington, DC, USA: AAAI Press.
  • Merz, C., Murphy, P., & Aha, D. (1997). UCI repository of machine learning databases. Irvine, CA: Department of Information and Computer Science, University of California.
  • Myles, J. P., & Hand, D. J. (1990). The multi-class metric problem in nearest neighbor discrimination rules. Pattern Recognition, 23(11), 1291–1297.
  • Nadeau, C., & Bengio, Y. (2003). Inference for the generalization error. Machine Learning, 52(3), 239–281.
  • Provost, F. J., & Domingos, P. (2003). Tree induction for probability-based ranking. Machine Learning, 52(3), 199–215.
  • Short, R. D., & Fukunaga, K. (1981). The optimal distance measure for nearest neighbour classification. IEEE Transactions on Information Theory, 27, 622–627.
  • Stanfill, C., & Waltz, D. (1986). Toward memory-based reasoning. Communications of the ACM, 29, 1213–1228.
  • Tunkelang, D. (2002). Making the nearest neighbor meaningful. Proceedings of SIAM Workshop on Clustering High Dimensional Data and its Applications. Arlington, VA, USA.
  • Wang, B., & Zhang, H. (2007). Probability-based metrics for locally weighted naive Bayes. Proceedings of the 20th Canadian Conference on Artificial Intelligence (pp. 180–191). Montreal, Canada: Springer.
  • Webb, G. I., Boughton, J., & Wang, Z. (2005). Not so naive Bayes: Aggregating one-dependence estimators. Machine Learning, 58, 5–24.
  • Wilson, D. R., & Martinez, T. R. (1997). Improved heterogeneous distance functions. Journal of Artificial Intelligence Research, 6(1), 1–34.
  • Witten, I. H., & Frank, E. (2005). Data mining: Practical machine learning tools and techniques (2nd ed.). San Francisco: Morgan Kaufmann.
