Mathematical Problems in Engineering
Volume 2014 (2014), Article ID 327142, 9 pages
http://dx.doi.org/10.1155/2014/327142
Research Article

An Incremental Classification Algorithm for Mining Data with Feature Space Heterogeneity

Yu Wang1,2

1School of Economic and Business Administration, Chongqing University, Chongqing 400030, China
2Chongqing Key Laboratory of Logistics, Chongqing University, Chongqing 400044, China

Received 16 December 2013; Accepted 13 January 2014; Published 19 February 2014

Academic Editor: J. J. Judice

Copyright © 2014 Yu Wang. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

References

  1. J. H. Friedman, “Flexible metric nearest neighbor classification,” Tech. Rep., Stanford University, 1994.
  2. C. Cardie and N. Howe, “Improving minority class prediction using case-specific feature weights,” in Proceedings of the 14th International Conference on Machine Learning, 1997.
  3. P. Domingos, “Context-sensitive feature selection for lazy learners,” Artificial Intelligence Review, vol. 11, no. 1–5, pp. 227–253, 1997.
  4. C. Apte, S. J. Hong, J. Hosking, J. Lepre, E. Pednault, and B. Rosen, “Decomposition of heterogeneous classification problems,” Intelligent Data Analysis, vol. 2, no. 1, pp. 81–96, 1998.
  5. Z. Lazarevic, T. Fiez, and Z. Obradovic, “Adaptive boosting for spatial functions with unstable driving attributes,” in Knowledge Discovery and Data Mining. Current Issues and New Applications, vol. 1805 of Lecture Notes in Computer Science, pp. 329–340, 2000.
  6. G. M. Allenby and P. E. Rossi, “Marketing models of consumer heterogeneity,” Journal of Econometrics, vol. 89, no. 1-2, pp. 57–78, 1998.
  7. W. S. Desarbo, A. Ansari, P. Chintagunta et al., “Representing heterogeneity in consumer response models,” Marketing Letters, vol. 8, no. 3, pp. 335–348, 1997.
  8. Z. Hua, S. Li, and Z. Tao, “A rule-based risk decision-making approach and its application in China's customs inspection decision,” Journal of the Operational Research Society, vol. 57, no. 11, pp. 1313–1322, 2006.
  9. K. J. Cios and G. William Moore, “Uniqueness of medical data mining,” Artificial Intelligence in Medicine, vol. 26, no. 1-2, pp. 1–24, 2002.
  10. Y. Liu, Y. Liu, and K. C. C. Chan, “Dimensionality reduction for heterogeneous dataset in rushes editing,” Pattern Recognition, vol. 42, no. 2, pp. 229–242, 2009.
  11. J.-T. Wong and Y.-S. Chung, “Analyzing heterogeneous accident data from the perspective of accident occurrence,” Accident Analysis and Prevention, vol. 40, no. 1, pp. 357–367, 2008.
  12. T. G. Dietterich, “Machine-learning research: four current directions,” AI Magazine, vol. 18, no. 4, pp. 97–136, 1997.
  13. C. Giraud-Carrier, “A note on the utility of incremental learning,” AI Communications, vol. 13, no. 4, pp. 215–223, 2000.
  14. X. Li and N. Ye, “A supervised clustering and classification algorithm for mining data with mixed variables,” IEEE Transactions on Systems, Man, and Cybernetics A, vol. 36, no. 2, pp. 396–406, 2006.
  15. T. Hastie and R. Tibshirani, “Discriminant adaptive nearest neighbor classification,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 18, no. 6, pp. 607–616, 1996.
  16. I. Scrypnyk and T. K. Ho, “Feature selection and training set sampling for ensemble learning on heterogeneous data,” Tech. Rep., DIMACS, 2003.
  17. M. K. Lim and S. Y. Sohn, “Cluster-based dynamic scoring model,” Expert Systems with Applications, vol. 32, no. 2, pp. 427–431, 2007.
  18. L. Breiman, “Bagging predictors,” Machine Learning, vol. 24, no. 2, pp. 123–140, 1996.
  19. S. Puuronen, V. Terziyan, and A. Tsymbal, “A dynamic integration algorithm for an ensemble of classifiers,” in Foundations of Intelligent Systems, vol. 1609 of Lecture Notes in Computer Science, pp. 592–600, 1999.
  20. T. K. Ho, “The random subspace method for constructing decision forests,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 20, no. 8, pp. 832–844, 1998.
  21. R. Paredes and E. Vidal, “Learning weighted metrics to minimize nearest-neighbor classification error,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 28, no. 7, pp. 1100–1110, 2006.
  22. S. M. Weiss and C. A. Kulikowski, Computer Systems that Learn: Classification and Prediction Methods from Statistics, Neural Nets, Machine Learning, and Expert Systems, Morgan Kaufmann, San Mateo, Calif, USA, 1991.
  23. K. Yamauchi, N. Yamaguchi, and N. Ishii, “Incremental learning methods with retrieving of interfered patterns,” IEEE Transactions on Neural Networks, vol. 10, no. 6, pp. 1351–1365, 1999.
  24. S.-U. Guan and S. Li, “Incremental learning with respect to new incoming input attributes,” Neural Processing Letters, vol. 14, no. 3, pp. 241–260, 2001.
  25. L. Su, S. U. Guan, and Y. C. Yeo, “Incremental self-growing neural networks with the changing environment,” Journal of Intelligent Systems, vol. 11, no. 1, pp. 43–74, 2001.
  26. S.-U. Guan and F. Zhu, “An incremental approach to genetic-algorithms-based classification,” IEEE Transactions on Systems, Man, and Cybernetics B, vol. 35, no. 2, pp. 227–239, 2005.
  27. P. Kang and S. Cho, “Locally linear reconstruction for instance-based learning,” Pattern Recognition, vol. 41, no. 11, pp. 3507–3518, 2008.
  28. S. Theodoridis and K. Koutroumbas, Pattern Recognition, Elsevier, 2nd edition, 2003.
  29. S. Vucetic and Z. Obradovic, “Discovering homogeneous regions in spatial data through competition,” in Proceedings of the 17th International Conference on Machine Learning, 2000.
  30. J. H. Gennari, P. Langley, and D. Fisher, “Models of incremental concept formation,” Artificial Intelligence, vol. 40, no. 1–3, pp. 11–61, 1989.
  31. C. Aviles-Cruz, A. Guérin-Dugué, J. L. Voz, and D. Van Cappel, “Enhanced learning for evolutive neural architecture (ELENA),” Tech. Rep. R3-B1-P, Neural Network Group, Université Catholique de Louvain, Louvain-la-Neuve, Belgium, 1995.
  32. S. J. Raudys and A. K. Jain, “Small sample size effects in statistical pattern recognition: recommendations for practitioners,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 13, no. 3, pp. 252–264, 1991.
  33. D. G. Luenberger and Y. Ye, Linear and Nonlinear Programming, Springer, New York, NY, USA, 3rd edition, 2008.
  34. Y. Kim and W. N. Street, “An intelligent system for customer targeting: a data mining approach,” Decision Support Systems, vol. 37, no. 2, pp. 215–228, 2004.
  35. Y. Kim, W. N. Street, G. J. Russell, and F. Menczer, “Customer targeting: a neural network approach guided by genetic algorithms,” Management Science, vol. 51, no. 2, pp. 264–276, 2005.