Computational Intelligence and Neuroscience
Volume 2018 (2018), Article ID 9293437, 12 pages
https://doi.org/10.1155/2018/9293437
Research Article

Automated Text Analysis Based on Skip-Gram Model for Food Evaluation in Predicting Consumer Acceptance

1Department of Food Science and Biotechnology, Sejong University, Seoul, Republic of Korea
2Department of Computer Science and Engineering, Sejong University, Seoul, Republic of Korea

Correspondence should be addressed to Hyeonjoon Moon; hmoon@sejong.ac.kr

Received 2 June 2017; Revised 27 July 2017; Accepted 7 November 2017; Published 22 January 2018

Academic Editor: Elio Masciari

Copyright © 2018 Augustine Yongwhi Kim et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
