Journal of Applied Mathematics
Volume 2013 (2013), Article ID 754698, 13 pages
http://dx.doi.org/10.1155/2013/754698
Research Article

Cost-Sensitive Feature Selection of Numeric Data with Measurement Errors

Laboratory of Granular Computing, Zhangzhou Normal University, Zhangzhou 363000, China

Received 24 December 2012; Accepted 22 March 2013

Academic Editor: Jung-Fa Tsai

Copyright © 2013 Hong Zhao et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

References

  1. P. Lanzi, “Fast feature selection with genetic algorithms: a filter approach,” in Proceedings of the IEEE International Conference on Evolutionary Computation, 1997.
  2. T. L. B. Tseng and C. C. Huang, “Rough set-based approach to feature selection in customer relationship management,” Omega, vol. 35, no. 4, pp. 365–383, 2007.
  3. N. Zhong, J. Z. Dong, and S. Ohsuga, “Using rough sets with heuristics for feature selection,” Journal of Intelligent Information Systems, vol. 16, no. 3, pp. 199–214, 2001.
  4. H. Liu and H. Motoda, Feature Selection for Knowledge Discovery and Data Mining, vol. 454, Springer, 1998.
  5. Y. Weiss, Y. Elovici, and L. Rokach, “The CASH algorithm: cost-sensitive attribute selection using histograms,” Information Sciences, vol. 222, pp. 247–268, 2013.
  6. C. Elkan, “The foundations of cost-sensitive learning,” in Proceedings of the 7th International Joint Conference on Artificial Intelligence, 2001.
  7. W. Fan, S. Stolfo, J. Zhang, and P. Chan, “AdaCost: misclassification cost-sensitive boosting,” in Proceedings of the 16th International Conference on Machine Learning, 1999.
  8. E. B. Hunt, J. Marin, and P. J. Stone, Experiments in Induction, Academic Press, New York, NY, USA, 1966.
  9. M. Pazzani, C. Merz, P. Murphy, K. Ali, T. Hume, and C. Brunk, “Reducing misclassification costs,” in Proceedings of the 11th International Conference on Machine Learning (ICML '94), Morgan Kaufmann, 1994.
  10. G. Fumera and F. Roli, “Cost-sensitive learning in support vector machines,” in Proceedings of VIII Convegno Associazione Italiana per l'Intelligenza Artificiale, 2002.
  11. C. X. Ling, Q. Yang, J. N. Wang, and S. C. Zhang, “Decision trees with minimal costs,” in Proceedings of the 21st International Conference on Machine Learning, 2004.
  12. R. Greiner, A. J. Grove, and D. Roth, “Learning cost-sensitive active classifiers,” Artificial Intelligence, vol. 139, no. 2, pp. 137–174, 2002.
  13. S. Ji and L. Carin, “Cost-sensitive feature acquisition and classification,” Pattern Recognition, vol. 40, pp. 1474–1485, 2007.
  14. N. Lavrac, D. Gamberger, and P. Turney, “Cost-sensitive feature reduction applied to a hybrid genetic algorithm,” in Proceedings of the 7th International Workshop on Algorithmic Learning Theory (ALT '96), 1996.
  15. F. Min, H. P. He, Y. H. Qian, and W. Zhu, “Test-cost-sensitive attribute reduction,” Information Sciences, vol. 181, pp. 4928–4942, 2011.
  16. R. Susmaga, “Computation of minimal cost reducts,” in Foundations of Intelligent Systems, Z. Ras and A. Skowron, Eds., vol. 1609 of Lecture Notes in Computer Science, pp. 448–456, Springer, Berlin, Germany, 1999.
  17. F. Min and W. Zhu, “Minimal cost attribute reduction through backtracking,” in Proceedings of the International Conference on Database Theory and Application, vol. 258 of FGIT-DTA/BSBT, CCIS, 2011.
  18. F. Min and Q. Liu, “A hierarchical model for test-cost-sensitive decision systems,” Information Sciences, vol. 179, no. 14, pp. 2442–2452, 2009.
  19. P. Turney, “Cost-sensitive classification: empirical evaluation of a hybrid genetic decision tree induction algorithm,” Journal of Artificial Intelligence Research, vol. 2, no. 1, pp. 369–409, 1994.
  20. D. Margineantu, “Methods for cost-sensitive learning,” 2001.
  21. S. Norton, “Generating better decision trees,” in Proceedings of the 11th International Joint Conference on Artificial Intelligence, 1989.
  22. M. Núñez, “The use of background knowledge in decision tree induction,” Machine Learning, vol. 6, no. 3, pp. 231–250, 1991.
  23. M. Tan, “Cost-sensitive learning of classification knowledge and its applications in robotics,” Machine Learning, vol. 13, no. 1, pp. 7–33, 1993.
  24. N. Johnson and S. Kotz, Continuous Distributions, John Wiley, New York, NY, USA.
  25. R. A. Johnson and D. W. Wichern, Applied Multivariate Statistical Analysis, vol. 4, Prentice Hall, Englewood Cliffs, NJ, USA, 3rd edition, 1992.
  26. F. Min, W. Zhu, H. Zhao, G. Y. Pan, J. B. Liu, and Z. L. Xu, “Coser: cost-sensitive rough sets,” 2012, http://grc.fjzs.edu.cn/~fmin/.
  27. Y. Y. Yao, “A partition model of granular computing,” Transactions on Rough Sets I, vol. 3100, pp. 232–253, 2004.
  28. H. Zhao, F. Min, and W. Zhu, “Test-cost-sensitive attribute reduction of data with normal distribution measurement errors,” Mathematical Problems in Engineering, vol. 2013, Article ID 946070, 12 pages, 2013.
  29. T. Y. Lin, “Granular computing on binary relations: analysis of conflict and Chinese wall security policy,” in Proceedings of Rough Sets and Current Trends in Computing, vol. 2475 of Lecture Notes in Artificial Intelligence, 2002.
  30. T. Y. Lin, “Granular computing—structures, representations, and applications,” in Lecture Notes in Artificial Intelligence, vol. 2639, 2003.
  31. L. Ma, “On some types of neighborhood-related covering rough sets,” International Journal of Approximate Reasoning, vol. 53, no. 6, pp. 901–911, 2012.
  32. H. Zhao, F. Min, and W. Zhu, “Test-cost-sensitive attribute reduction based on neighborhood rough set,” in Proceedings of the IEEE International Conference on Granular Computing, 2011.
  33. W. Zhu, “Generalized rough sets based on relations,” Information Sciences, vol. 177, no. 22, pp. 4997–5011, 2007.
  34. W. Zhu and F.-Y. Wang, “Reduction and axiomization of covering generalized rough sets,” Information Sciences, vol. 152, pp. 217–230, 2003.
  35. F. Min and W. Zhu, “Attribute reduction of data with error ranges and test costs,” Information Sciences, vol. 211, pp. 48–67, 2012.
  36. Z. Zhou and X. Liu, “Training cost-sensitive neural networks with methods addressing the class imbalance problem,” IEEE Transactions on Knowledge and Data Engineering, vol. 18, no. 1, pp. 63–77, 2006.
  37. H. Zhao, F. Min, and W. Zhu, “A backtracking approach to minimal cost feature selection of numerical data,” Journal of Information & Computational Science. In press.
  38. M. Kukar and I. Kononenko, “Cost-sensitive learning with neural networks,” in Proceedings of the 13th European Conference on Artificial Intelligence (ECAI '98), John Wiley & Sons, Chichester, UK, 1998.
  39. J. Lan, M. Hu, E. Patuwo, and G. Zhang, “An investigation of neural network classifiers with unequal misclassification costs and group sizes,” Decision Support Systems, vol. 48, no. 4, pp. 582–591, 2010.
  40. P. Turney, “Types of cost in inductive concept learning,” in Proceedings of the ICML-2000 Workshop on Cost-Sensitive Learning, 2000.
  41. S. Viaene and G. Dedene, “Cost-sensitive learning and decision making revisited,” European Journal of Operational Research, vol. 166, no. 1, pp. 212–220, 2005.
  42. Z. Pawlak, “Rough sets,” International Journal of Computer and Information Sciences, vol. 11, no. 5, pp. 341–356, 1982.
  43. J. Błaszczyński, S. Greco, R. Słowiński, and M. Szeląg, “Monotonic variable consistency rough set approaches,” International Journal of Approximate Reasoning, vol. 50, no. 7, pp. 979–999, 2009.
  44. Z. Bonikowski, E. Bryniarski, and U. Wybraniec-Skardowska, “Extensions and intentions in the rough set theory,” Information Sciences, vol. 107, no. 1–4, pp. 149–167, 1998.
  45. M. Inuiguchi, Y. Yoshioka, and Y. Kusunoki, “Variable-precision dominance-based rough set approach and attribute reduction,” International Journal of Approximate Reasoning, vol. 50, no. 8, pp. 1199–1214, 2009.
  46. Y. Kudo, T. Murai, and S. Akama, “A granularity-based framework of deduction, induction, and abduction,” International Journal of Approximate Reasoning, vol. 50, no. 8, pp. 1215–1226, 2009.
  47. J. A. Pomykała, “Approximation operations in approximation space,” Bulletin of the Polish Academy of Sciences: Mathematics, vol. 35, no. 9-10, pp. 653–662, 1987.
  48. Y. Y. Yao, “Constructive and algebraic methods of the theory of rough sets,” Information Sciences, vol. 109, no. 1–4, pp. 21–47, 1998.
  49. Y. Y. Yao, “Probabilistic rough set approximations,” International Journal of Approximate Reasoning, vol. 49, no. 2, pp. 255–271, 2008.
  50. W. Zakowski, “Approximations in the space (U, Π),” Demonstratio Mathematica, vol. 16, no. 40, pp. 761–769, 1983.
  51. W. Zhu, “Relationship among basic concepts in covering-based rough sets,” Information Sciences, vol. 179, no. 14, pp. 2478–2486, 2009.
  52. W. Zhu and F. Wang, “On three types of covering-based rough sets,” IEEE Transactions on Knowledge and Data Engineering, vol. 19, no. 8, pp. 1131–1144, 2007.
  53. S. Calegari and D. Ciucci, “Granular computing applied to ontologies,” International Journal of Approximate Reasoning, vol. 51, no. 4, pp. 391–409, 2010.
  54. W. Zhu and F. Wang, “Covering based granular computing for conflict analysis,” Intelligence and Security Informatics, pp. 566–571, 2006.
  55. Wikipedia, http://www.wikipedia.org/.
  56. Z. Pawlak, Rough Sets: Theoretical Aspects of Reasoning about Data, Kluwer Academic, Boston, Mass, USA, 1991.
  57. M. Dash and H. Liu, “Feature selection for classification,” Intelligent Data Analysis, vol. 1, no. 1–4, pp. 131–156, 1997.
  58. X. Wang, J. Yang, X. Teng, W. Xia, and R. Jensen, “Feature selection based on rough sets and particle swarm optimization,” Pattern Recognition Letters, vol. 28, no. 4, pp. 459–471, 2007.
  59. W. Siedlecki and J. Sklansky, “A note on genetic algorithms for large-scale feature selection,” Pattern Recognition Letters, vol. 10, no. 5, pp. 335–347, 1989.
  60. C. L. Blake and C. J. Merz, “UCI repository of machine learning databases,” 1998, http://www.ics.uci.edu/~mlearn/mlrepository.html.
  61. Q. H. Liu, F. Li, F. Min, M. Ye, and G. W. Yang, “An efficient reduction algorithm based on new conditional information entropy,” Control and Decision, vol. 20, no. 8, pp. 878–882, 2005 (Chinese).
  62. A. Skowron and C. Rauszer, “The discernibility matrices and functions in information systems,” in Intelligent Decision Support, 1992.
  63. G. Wang, “Attribute core of decision table,” in Proceedings of Rough Sets and Current Trends in Computing, vol. 2475 of Lecture Notes in Computer Science, 2002.