Mathematical Problems in Engineering
Volume 2012, Article ID 670723, 18 pages
http://dx.doi.org/10.1155/2012/670723
Research Article

Prediction of Hydrocarbon Reservoirs Permeability Using Support Vector Machine

1 Faculty of Mining, Petroleum and Geophysics, Shahrood University of Technology, P.O. Box 316, Shahrood, Iran
2 Department of Industrial Engineering, University of Sistan and Baluchestan, Zahedan, Iran
3 Young Researchers Club, Islamic Azad University, Zahedan Branch, Zahedan 98168, Iran

Received 16 July 2011; Revised 8 October 2011; Accepted 1 November 2011

Academic Editor: P. Liatsis

Copyright © 2012 R. Gholami et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Linked References

  1. A. Bhatt, Reservoir properties from well logs using neural networks, Ph.D. thesis, Department of Petroleum Engineering and Applied Geophysics, Norwegian University of Science and Technology, 2002.
  2. W. F. Brace, “Permeability from resistivity and pore shape,” Journal of Geophysical Research, vol. 82, no. 23, pp. 3343–3349, 1977.
  3. D. Tiab and E. C. Donaldson, Petrophysics: Theory and Practice of Measuring Reservoir Rock and Fluid Properties, Gulf Publishing Co., 2004.
  4. N. Kumar, N. Hughes, and M. Scott, Using Well Logs to Infer Permeability, Center for Applied Petrophysical Studies, Texas Tech University, 2000.
  5. S. Mohaghegh, R. Arefi, S. Ameri, and D. Rose, “Design and development of an artificial neural network for estimation of formation permeability,” in Proceedings of the Petroleum Computer Conference, SPE 28237, pp. 147–154, Dallas, Tex, USA, July–August 1994.
  6. S. Mohaghegh, B. Balan, and S. Ameri, “State-of-the-art in permeability determination from well log data: part 2—verifiable, accurate permeability predictions, the touch-stone of all models,” in Proceedings of the Eastern Regional Conference, SPE 30979, pp. 43–47, September 1995.
  7. B. Balan, S. Mohaghegh, and S. Ameri, “State-of-the-art in permeability determination from well log data, part 1, a comparative study, model development,” in Proceedings of the SPE Eastern Regional Conference and Exhibition, SPE 30978, Morgantown, WVa, USA, 1995.
  8. S. Mohaghegh, S. Ameri, and K. Aminian, “A methodological approach for reservoir heterogeneity characterization using artificial neural networks,” Journal of Petroleum Science and Engineering, vol. 16, pp. 263–274, 1996.
  9. Z. Huang, J. Shimeld, M. Williamson, and J. Katsube, “Permeability prediction with artificial neural network modeling in the Venture gas field, offshore eastern Canada,” Geophysics, vol. 61, no. 2, pp. 422–436, 1996.
  10. Y. Zhang, H. A. Salisch, and C. Arns, “Permeability evaluation in a glauconite-rich formation in the Carnarvon Basin, Western Australia,” Geophysics, vol. 65, no. 1, pp. 46–53, 2000.
  11. P. M. Wong, M. Jang, S. Cho, and T. D. Gedeon, “Multiple permeability predictions using an observational learning algorithm,” Computers and Geosciences, vol. 26, no. 8, pp. 907–913, 2000.
  12. K. Aminian, H. I. Bilgesu, S. Ameri, and E. Gil, “Improving the simulation of waterflood performance with the use of neural networks,” in Proceedings of the SPE Eastern Regional Meeting, SPE 65630, pp. 105–110, October 2000.
  13. K. Aminian, B. Thomas, H. I. Bilgesu, S. Ameri, and A. Oyerokun, “Permeability distribution prediction,” in Proceedings of the SPE Eastern Regional Conference, October 2001.
  14. L. Rolon, Developing intelligent synthetic logs: application to Upper Devonian units in PA, M.S. thesis, West Virginia University, Morgantown, WVa, USA, 2004.
  15. E. Artun, S. Mohaghegh, and J. Toro, “Reservoir characterization using intelligent seismic inversion,” in Proceedings of the SPE Eastern Regional Meeting, SPE 98012, West Virginia University, Morgantown, WVa, September 2005.
  16. M. Stitson, A. Gammerman, V. Vapnik, V. Vovk, C. Watkins, and J. Weston, Advances in Kernel Methods—Support Vector Learning, MIT Press, Cambridge, Mass, USA, 1999.
  17. V. N. Vapnik, The Nature of Statistical Learning Theory, Springer, New York, NY, USA, 1995.
  18. M. Behzad, K. Asghari, M. Eazi, and M. Palhang, “Generalization performance of support vector machines and neural networks in runoff modeling,” Expert Systems with Applications, vol. 36, no. 4, pp. 7624–7629, 2009.
  19. V. N. Vapnik, The Nature of Statistical Learning Theory, Springer, New York, NY, USA, 2nd edition, 2000.
  20. V. N. Vapnik, Statistical Learning Theory, Wiley, New York, NY, USA, 1998.
  21. N. Cristianini and J. Shawe-Taylor, An Introduction to Support Vector Machines, Cambridge University Press, Cambridge, UK, 2000.
  22. C. M. Bishop, Pattern Recognition and Machine Learning, Springer, New York, NY, USA, 2006.
  23. C. Cortes, Prediction of generalization ability in learning machines, Ph.D. thesis, University of Rochester, Rochester, NY, USA, 1995.
  24. L. Wang, Support Vector Machines: Theory and Applications, Springer, Berlin, Germany, 2005.
  25. M. Martinez-Ramon and C. Christodoulou, Support Vector Machines for Antenna Array Processing and Electromagnetics, Universidad Carlos III de Madrid, Morgan & Claypool, San Rafael, Calif, USA, 2006.
  26. D. Zhou, B. Xiao, and H. Zhou, “Global geometry of SVM classifiers,” Tech. Rep., Institute of Automation, Chinese Academy of Sciences, AI Lab, 2002.
  27. K. P. Bennett and E. J. Bredensteiner, Geometry in Learning, Geometry at Work, Mathematical Association of America, Washington, DC, USA, 1998.
  28. S. S. Keerthi, S. K. Shevade, C. Bhattacharyya, and K. R. K. Murthy, “A fast iterative nearest point algorithm for support vector machine classifier design,” IEEE Transactions on Neural Networks, vol. 11, no. 1, pp. 124–136, 2000.
  29. S. Mukherjee, E. Osuna, and F. Girosi, “Nonlinear prediction of chaotic time series using support vector machines,” in Proceedings of the 7th IEEE Workshop on Neural Networks for Signal Processing (NNSP '97), pp. 511–520, Amelia Island, Fla, USA, September 1997.
  30. J. T. Jeng, C. C. Chuang, and S. F. Su, “Support vector interval regression networks for interval regression analysis,” Fuzzy Sets and Systems, vol. 138, no. 2, pp. 283–300, 2003.
  31. K. K. Seo, “An application of one-class support vector machines in content-based image retrieval,” Expert Systems with Applications, vol. 33, no. 2, pp. 491–498, 2007.
  32. K. Trontl, T. Šmuc, and D. Pevec, “Support vector regression model for the estimation of γ-ray buildup factors for multi-layer shields,” Annals of Nuclear Energy, vol. 34, no. 12, pp. 939–952, 2007.
  33. A. Widodo and B. S. Yang, “Wavelet support vector machine for induction machine fault diagnosis based on transient current signal,” Expert Systems with Applications, vol. 35, no. 1-2, pp. 307–316, 2008.
  34. R. Burbidge, M. Trotter, B. Buxton, and S. Holden, “Drug design by machine learning: support vector machines for pharmaceutical data analysis,” Computers and Chemistry, vol. 26, no. 1, pp. 5–14, 2001.
  35. G. Valentini, “Gene expression data analysis of human lymphoma using support vector machines and output coding ensembles,” Artificial Intelligence in Medicine, vol. 26, no. 3, pp. 281–304, 2002.
  36. S. Tripathi, V. V. Srinivas, and R. S. Nanjundiah, “Downscaling of precipitation for climate change scenarios: a support vector machine approach,” Journal of Hydrology, vol. 330, no. 3-4, pp. 621–640, 2006.
  37. C. Sanchez-Hernandez, D. S. Boyd, and G. M. Foody, “Mapping specific habitats from remotely sensed imagery: support vector machine and support vector data description based classification of coastal saltmarsh habitats,” Ecological Informatics, vol. 2, no. 2, pp. 83–88, 2007.
  38. C. Z. Cai, W. L. Wang, L. Z. Sun, and Y. Z. Chen, “Protein function classification via support vector machine approach,” Mathematical Biosciences, vol. 185, no. 2, pp. 111–122, 2003.
  39. F. E. H. Tay and L. J. Cao, “Modified support vector machines in financial time series forecasting,” Neurocomputing, vol. 48, pp. 847–861, 2002.
  40. I. Steinwart, Support Vector Machines, Los Alamos National Laboratory, Information Sciences Group (CCS-3), Springer, New York, NY, USA, 2008.
  41. M. Saemi, M. Ahmadi, and A. Y. Varjani, “Design of neural networks using genetic algorithm for the permeability estimation of the reservoir,” Journal of Petroleum Science and Engineering, vol. 59, no. 1-2, pp. 97–105, 2007.
  42. H. Liu, X. Yao, R. Zhang, M. Liu, Z. Hu, and B. Fan, “The accurate QSPR models to predict the bioconcentration factors of nonionic organic compounds based on the heuristic method and support vector machine,” Chemosphere, vol. 63, no. 5, pp. 722–733, 2006.
  43. T. W. Lee, Independent Component Analysis: Theory and Applications, Kluwer Academic Publishers, 1998.
  44. A. Hyvärinen, “Fast and robust fixed-point algorithms for independent component analysis,” IEEE Transactions on Neural Networks, vol. 10, no. 3, pp. 626–634, 1999.
  45. A. Hyvärinen and E. Oja, “Independent component analysis: algorithms and applications,” Neural Networks, vol. 13, no. 4-5, pp. 411–430, 2000.
  46. A. Hyvärinen, “Survey on independent component analysis,” Neural Computing Surveys, vol. 2, pp. 94–128, 1999.
  47. A. J. Bell and T. J. Sejnowski, “The “independent components” of natural scenes are edge filters,” Vision Research, vol. 37, no. 23, pp. 3327–3338, 1997.
  48. Q. A. Tran, X. Li, and H. Duan, “Efficient performance estimate for one-class support vector machine,” Pattern Recognition Letters, vol. 26, no. 8, pp. 1174–1182, 2005.
  49. S. Merler and G. Jurman, “Terminated Ramp-support vector machines: a nonparametric data dependent kernel,” Neural Networks, vol. 19, no. 10, pp. 1597–1611, 2006.
  50. H. Liu, S. Wen, W. Li, C. Xu, and C. Hu, “Study on identification of oil/gas and water zones in geological logging based on support-vector machine,” Fuzzy Information and Engineering, vol. 2, pp. 849–857, 2009.
  51. Q. Li, L. Jiao, and Y. Hao, “Adaptive simplification of solution for support vector machine,” Pattern Recognition, vol. 40, no. 3, pp. 972–980, 2007.
  52. H.-J. Lin and J. P. Yeh, “Optimal reduction of solutions for support vector machines,” Applied Mathematics and Computation, vol. 214, no. 2, pp. 329–335, 2009.
  53. E. Eryarsoy, G. J. Koehler, and H. Aytug, “Using domain-specific knowledge in generalization error bounds for support vector machine learning,” Decision Support Systems, vol. 46, no. 2, pp. 481–491, 2009.
  54. B. Schölkopf, A. Smola, and K. R. Müller, “Nonlinear component analysis as a kernel eigenvalue problem,” Neural Computation, vol. 10, no. 5, pp. 1299–1319, 1998.
  55. B. Walczak and D. L. Massart, “The radial basis functions—partial least squares approach as a flexible non-linear regression technique,” Analytica Chimica Acta, vol. 331, no. 3, pp. 177–185, 1996.
  56. R. Rosipal and L. J. Trejo, “Kernel partial least squares regression in reproducing kernel Hilbert space,” Journal of Machine Learning Research, vol. 2, pp. 97–123, 2004.
  57. S. Mika, G. Rätsch, J. Weston, B. Schölkopf, and K. R. Müller, “Fisher discriminant analysis with kernels,” in Proceedings of the 9th IEEE Workshop on Neural Networks for Signal Processing (NNSP '99), pp. 41–48, August 1999.
  58. B. Schölkopf and A. J. Smola, Learning with Kernels, MIT Press, Cambridge, Mass, USA, 2002.
  59. S. R. Gunn, “Support vector machines for classification and regression,” Tech. Rep., Image Speech and Intelligent Systems Research Group, University of Southampton, Southampton, UK, 1997.
  60. C. H. Wu, G. H. Tzeng, and R. H. Lin, “A novel hybrid genetic algorithm for kernel function and parameter optimization in support vector regression,” Expert Systems with Applications, vol. 36, no. 3, pp. 4725–4735, 2009.
  61. V. D. A. Sánchez, “Advanced support vector machines and kernel methods,” Neurocomputing, vol. 55, no. 1-2, pp. 5–20, 2003.
  62. D. F. Specht, “A general regression neural network,” IEEE Transactions on Neural Networks, vol. 2, no. 6, pp. 568–576, 1991.
  63. Y. B. Dibike, S. Velickov, D. Solomatine, and M. B. Abbott, “Model induction with support vector machines: introduction and applications,” Journal of Computing in Civil Engineering, vol. 15, no. 3, pp. 208–216, 2001.
  64. W. Wang, Z. Xu, W. Lu, and X. Zhang, “Determination of the spread parameter in the Gaussian kernel for classification and regression,” Neurocomputing, vol. 55, no. 3-4, pp. 643–663, 2003.
  65. D. Han and I. Cluckie, “Support vector machines identification for runoff modeling,” in Proceedings of the 6th International Conference on Hydroinformatics, S. Y. Liong, K. K. Phoon, and V. Babovic, Eds., pp. 21–24, Singapore, June 2004.