Computational Intelligence and Neuroscience
Volume 2016, Article ID 1537325, 13 pages
http://dx.doi.org/10.1155/2016/1537325
Research Article

Metaheuristic Algorithms for Convolution Neural Network

L. M. Rasdi Rere,1,2 Mohamad Ivan Fanany,1 and Aniati Murni Arymurthy1

1Machine Learning and Computer Vision Laboratory, Faculty of Computer Science, Universitas Indonesia, Depok 16424, Indonesia
2Computer System Laboratory, STMIK Jakarta STI&K, Jakarta 12140, Indonesia

Received 29 January 2016; Revised 15 April 2016; Accepted 10 May 2016

Academic Editor: Martin Hagan

Copyright © 2016 L. M. Rasdi Rere et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
