Journal of Applied Mathematics
Volume 2012, Article ID 258054, 13 pages
http://dx.doi.org/10.1155/2012/258054
Research Article

Applying Randomness Effectively Based on Random Forests for Classification Task of Datasets of Insufficient Information

Hyontai Sug

Division of Computer and Information Engineering, Dongseo University, Busan 617-716, Republic of Korea

Received 20 July 2012; Revised 8 October 2012; Accepted 8 October 2012

Academic Editor: Hak-Keung Lam

Copyright © 2012 Hyontai Sug. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

References

  1. B. Evans and D. Fisher, “Using decision tree induction to minimize process delays in printing industry,” in Handbook of Data Mining and Knowledge Discovery, W. Klösgen and J. M. Żytkow, Eds., pp. 874–880, Oxford University Press, 2002.
  2. B. Evans and D. Fisher, “Overcoming process delays with decision tree induction,” IEEE Expert, vol. 9, no. 1, pp. 60–66, 1994.
  3. V. G. Kaburlasos and V. Petridis, “Fuzzy Lattice Neurocomputing (FLN) models,” Neural Networks, vol. 13, no. 10, pp. 1145–1170, 2000.
  4. A. Cripps, V. G. Kaburlasos, N. Nguyen, and S. E. Papadakis, “Improved experimental results using Fuzzy Lattice Neurocomputing (FLN) classifiers,” in Proceedings of the International Conference on Machine Learning; Models, Technologies and Applications (MLMTA '03), pp. 161–166, Las Vegas, Nev, USA, June 2003.
  5. P. Panov and S. Džeroski, “Combining bagging and random subspaces to create better ensembles,” Lecture Notes in Computer Science, vol. 4723, pp. 118–129, 2007.
  6. T. K. Ho, “The random subspace method for constructing decision forests,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 20, no. 8, pp. 832–844, 1998.
  7. L. Breiman, “Random forests,” Machine Learning, vol. 45, no. 1, pp. 5–32, 2001.
  8. J. R. Quinlan, C4.5: Programs for Machine Learning, Morgan Kaufmann, 1993.
  9. “Big O notation,” 16.070 Introduction to Computers and Programming, MIT, http://web.mit.edu/16.070/www/lecture/big_o.pdf.
  10. L. Breiman, “Bagging predictors,” Machine Learning, vol. 24, no. 2, pp. 123–140, 1996.
  11. W. W. Cohen, “Fast effective rule induction,” in Proceedings of the 12th International Conference on Machine Learning, pp. 115–123, Tahoe City, Calif, USA, 1995.
  12. H. Boström, “Concurrent learning of large-scale random forests,” in Proceedings of the Scandinavian Conference on Artificial Intelligence, pp. 20–29, Trondheim, Norway, 2011.
  13. J. R. Quinlan, “Induction of decision trees,” Machine Learning, vol. 1, no. 1, pp. 81–106, 1986.
  14. B. Efron and R. Tibshirani, “Improvements on cross-validation: the .632+ bootstrap method,” Journal of the American Statistical Association, vol. 92, no. 438, pp. 548–560, 1997.
  15. Class RandomForest, http://weka.sourceforge.net/doc/weka/classifiers/trees/RandomForest.html.
  16. L. Breiman and A. Cutler, “Random Forests,” http://www.stat.berkeley.edu/users/breiman/RandomForests/.
  17. R. Genuer, J. Poggi, and C. Tuleau, “Random Forests: some methodological insights,” Tech. Rep. inria-00340725, INRIA, 2008.
  18. A. Liaw and M. Wiener, “Classification and regression by randomForest,” R News, vol. 2, no. 3, pp. 18–22, 2002.
  19. L. Breiman and A. Cutler, “Random Forests,” http://www.stat.berkeley.edu/~breiman/RandomForests/cc_home.htm.
  20. A. Frank and A. Asuncion, “UCI machine learning repository,” University of California, School of Information and Computer Science, Irvine, Calif, USA, 2010, http://archive.ics.uci.edu/ml.
  21. WEKA, http://www.cs.waikato.ac.nz/ml/weka/.
  22. “Salford Systems: Random Forests,” http://www.salford-systems.com/en/products/randomforests.
  23. “The R project for statistical computing,” http://www.r-project.org/.
  24. M. A. Hall, Correlation-based feature subset selection for machine learning [Ph.D. thesis], The University of Waikato, Hamilton, New Zealand, 1999.
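
For readers who wish to reproduce experiments of this kind, references [15, 21] point to the WEKA toolkit, whose weka.classifiers.trees.RandomForest class implements Breiman's algorithm [7]. The following is a minimal illustrative sketch only, assuming a WEKA 3.6-era API (where RandomForest exposes setNumTrees and setNumFeatures); the file name iris.arff is a placeholder for any ARFF data set.

    import java.util.Random;

    import weka.classifiers.Evaluation;
    import weka.classifiers.trees.RandomForest;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class RandomForestSketch {
        public static void main(String[] args) throws Exception {
            // Load an ARFF data set; "iris.arff" is a placeholder file name.
            Instances data = DataSource.read("iris.arff");
            // WEKA requires the class attribute to be set explicitly;
            // here we assume it is the last attribute.
            data.setClassIndex(data.numAttributes() - 1);

            RandomForest forest = new RandomForest();
            forest.setNumTrees(100);   // number of trees in the forest
            forest.setNumFeatures(0);  // 0 lets WEKA choose its default number
                                       // of random features per split

            // Estimate accuracy with 10-fold cross-validation.
            Evaluation eval = new Evaluation(data);
            eval.crossValidateModel(forest, data, 10, new Random(1));
            System.out.println(eval.toSummaryString());
        }
    }

An equivalent experiment can be run in R [23] with the randomForest package described in [18].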