The Scientific World Journal
Volume 2014, Article ID 135641, 13 pages
http://dx.doi.org/10.1155/2014/135641
Review Article

Creation of Reliable Relevance Judgments in Information Retrieval Systems Evaluation Experimentation through Crowdsourcing: A Review

Department of Information Systems, Faculty of Computer Science and Information Technology, University of Malaya, 50603 Kuala Lumpur, Malaysia

Received 30 December 2013; Accepted 8 April 2014; Published 19 May 2014

Academic Editors: L. Li, L. Sanchez, and F. Yu

Copyright © 2014 Parnia Samimi and Sri Devi Ravana. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Citations to this Article [3 citations]

The following published articles have cited this article.

  • Justin W. Collins, Harko Verhagen, Alexander Mottrie, and Peter N. Wiklund, “Application and Integration of Live Streaming from Leading Robotic Centres Can Enhance Surgical Education,” European Urology, 2015.
  • Germán A. Osorio-Zuluaga and Néstor Darío Duque Méndez, “Collaborative construction of metadata and full-text dataset,” Proceedings of the 2016 11th Latin American Conference on Learning Objects and Technology (LACLO 2016), 2016.
  • Eiji Aramaki, Shuko Shikata, Satsuki Ayaya, and Shin-Ichiro Kumagaya, “Crowdsourced Identification of Possible Allergy-Associated Factors: Automated Hypothesis Generation and Validation Using Crowdsourcing Services,” JMIR Research Protocols, vol. 6, no. 5, p. e83, 2017.