ISRN Education
Volume 2012 (2012), Article ID 179824, 11 pages
http://dx.doi.org/10.5402/2012/179824
Research Article

International Education Studies: Increasing Their Linguistic Comparability by Developing Judgmental Reviews

Finnish Institute for Educational Research, University of Jyväskylä, P.O. Box 35, 40014 Jyväskylä, Finland

Received 13 December 2011; Accepted 3 January 2012

Academic Editors: B. Marlow and P. A. Prelock

Copyright © 2012 Inga Arffman. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

References

  1. J. Sweller, “Cognitive load during problem solving: effects on learning,” Cognitive Science, vol. 12, no. 2, pp. 257–285, 1988.
  2. R. Rueda, “Cultural perspectives in reading: theory and research,” in Handbook of Reading Research, M. Kamil, P. D. Pearson, E. B. Moje, and P. Afflerbach, Eds., vol. 4, pp. 84–104, Routledge, New York, NY, USA, 2011.
  3. R. Tourangeau, L. Rips, and K. Rasinski, The Psychology of Survey Response, Cambridge University Press, Cambridge, UK, 2000.
  4. K. Ercikan, M. J. Gierl, T. McCreith, G. Puhan, and K. Koh, “Comparability of bilingual versions of assessments: sources of incomparability of English and French versions of Canada's national achievement tests,” Applied Measurement in Education, vol. 17, no. 3, pp. 301–321, 2004.
  5. F. Guérin-Pace and A. Blum, “The comparative illusion: the international adult literacy survey,” Population, vol. 12, pp. 215–246, 2000.
  6. R. Hambleton, “Issues, designs, and technical guidelines for adapting tests into multiple languages and cultures,” in Adapting Educational and Psychological Tests for Cross-Cultural Assessment, R. Hambleton, P. Merenda, and C. Spielberger, Eds., pp. 3–38, Erlbaum, Mahwah, NJ, USA, 2005.
  7. A. Allalouf, “Revising translated differential item functioning items as a tool for improving cross-lingual assessment,” Applied Measurement in Education, vol. 16, no. 1, pp. 55–73, 2003.
  8. I. Arffman, “The problem of equivalence in translating texts in international reading literacy studies: a text analytic study of three English and Finnish texts used in the PISA 2000 reading test,” Research Reports 21, University of Jyväskylä, Institute for Educational Research, Jyväskylä, Finland, 2007.
  9. P. Elosua and A. López-Jaúregui, “Potential sources of differential item functioning in the adaptation of tests,” International Journal of Testing, vol. 7, no. 1, pp. 39–52, 2007.
  10. K. Ercikan, “Disentangling sources of differential item functioning in multilanguage assessments,” International Journal of Testing, vol. 2, no. 3-4, pp. 199–215, 2002.
  11. K. Ercikan, R. Arim, D. Law, J. Domene, F. Gagnon, and S. Lacroix, “Application of think aloud protocols for examining and confirming sources of differential item functioning identified by expert reviews,” Educational Measurement: Issues and Practice, vol. 29, no. 2, pp. 24–35, 2010.
  12. A. Grisay, “PISA 2000: differences in item difficulty between English, French and German countries,” Unpublished manuscript, 2004.
  13. G. Solano-Flores, E. Backhoff, and L. Contreras-Niño, “Theory of test translation error,” International Journal of Testing, vol. 9, no. 2, pp. 78–91, 2009.
  14. G. Engelhard, L. Hansche, and K. Rutledge, “Accuracy of bias review judges in identifying differential item functioning on teacher certification tests,” Applied Measurement in Education, vol. 3, no. 4, pp. 347–360, 1990.
  15. M. J. Gierl and S. N. Khaliq, “Identifying sources of differential item and bundle functioning on translated achievement tests: a confirmatory analysis,” Journal of Educational Measurement, vol. 38, no. 2, pp. 164–187, 2001.
  16. L. Roussos and W. Stout, “A multidimensionality-based DIF analysis paradigm,” Applied Psychological Measurement, vol. 20, no. 4, pp. 355–371, 1996.
  17. A. Grisay and C. Monseur, “Measuring the equivalence of item difficulty in the various versions of an international test,” Studies in Educational Evaluation, vol. 33, no. 1, pp. 69–86, 2007.
  18. A. Grisay, E. Gonzalez, and C. Monseur, “Equivalence of item difficulties across national versions in PIRLS and PISA reading assessments,” IERI Monograph Series: Issues and Methodologies in Large-Scale Assessments, vol. 2, pp. 63–84, 2009.
  19. A. Grisay, H. de Jong, E. Gebhardt, A. Berezner, and B. Halleux-Monseur, “Translation equivalence across PISA countries,” Journal of Applied Measurement, vol. 8, no. 3, pp. 249–266, 2007.
  20. R. Adams and M. Wu, Eds., PISA 2000 Technical Report, OECD, Paris, France, 2002.
  21. R. Adams, “Scaling PISA cognitive data,” in PISA 2000 Technical Report, R. Adams and M. Wu, Eds., OECD, Paris, France, 2002.
  22. J. McQueen and J. Mendelovits, “PISA reading: cultural equivalence in a cross-cultural study,” Language Testing, vol. 20, pp. 208–224, 2003.
  23. R. Adams and C. Carstensen, “Scaling outcomes,” in PISA 2000 Technical Report, R. Adams and M. Wu, Eds., OECD, Paris, France, 2002.
  24. OECD, Translation, Adaptation and Verification of Test Material in OECD International Surveys, OECD, Paris, France, 2010.
  25. J. Olson, M. Martin, and I. Mullis, Eds., TIMSS 2007 Technical Report, TIMSS & PIRLS International Study Center, Boston College, Chestnut Hill, Mass, USA, 2008.
  26. OECD, Reading for Change. Performance and Engagement Across Countries. Results from PISA 2000, OECD, Paris, France, 2002.
  27. OECD, PISA 2006 Technical Report, OECD, Paris, France, 2009.
  28. OECD, National Project Manager’s Manual, OECD, Paris, France, 1999.
  29. OECD, Report on the dodgy items detected in Finland in the PISA 2000 reading literacy assessment, Unpublished raw data, 2000.
  30. OECD, National Project Manager’s Manual, OECD, Paris, France, 2000.
  31. OECD, NPM Manual PISA 2009. National Project Managers’ Manual for the PISA 2009 Survey, OECD, Paris, France, 2009.
  32. Finnish Institute for Educational Research, Finland’s report on the judgmental review of the dodgy items in the PISA 2000 reading literacy assessment, Unpublished raw data, 2000.
  33. S. M. Corey, Action Research to Improve School Practices, Teachers College Press, New York, NY, USA, 1953.
  34. J. McKernan, Curriculum Action Research, Kogan Page, London, UK, 2nd edition, 2000.
  35. M. Q. Patton, Qualitative Research & Evaluation Methods, Sage, Thousand Oaks, Calif, USA, 3rd edition, 2002.
  36. I. S. Kirsch, The International Adult Literacy Survey (IALS): Understanding what was Measured, Educational Testing Service, Princeton, NJ, USA, 2001.
  37. P. B. Mosenthal and I. S. Kirsch, “Toward an explanatory model of document literacy,” Discourse Processes, vol. 14, pp. 147–180, 1991.
  38. C. Alderson, Assessing Reading, Cambridge University Press, Cambridge, UK, 2000.
  39. J. Chall and E. Dale, Readability Revisited: The New Chall-Dale Readability Formula, Brookline Books, Cambridge, Mass, USA, 1995.
  40. A. Allalouf, R. Hambleton, and S. Sireci, “Identifying the causes of DIF in translated verbal items,” Journal of Educational Measurement, vol. 36, no. 3, pp. 185–198, 1999.
  41. M. A. K. Halliday, An Introduction to Functional Grammar, Arnold, London, UK, 2nd edition, 1994.
  42. A. Baugh and T. Cable, A History of the English Language, Routledge, London, UK, 2002.
  43. F. Karlsson, Finnish: An Essential Grammar, Routledge, London, UK, 1999.
  44. R. Ingo, Suomen kieli vieraan silmin [The Finnish language through the eyes of a foreigner], University of Vaasa, Vaasa, Finland, 2000 (Research Group for LSP and Theory of Translation, No. 26).
  45. R. C. Anderson and A. Davison, “Conceptual and empirical bases of readability formulas,” in Linguistic Complexity and Text Comprehension: Readability Issues Reconsidered, A. Davison and G. Green, Eds., pp. 23–53, Erlbaum, Hillsdale, NJ, USA, 1988.
  46. I. Schlesinger, Sentence Structure and the Reading Process, Mouton, The Hague, The Netherlands, 1968.
  47. H. Yildirim and G. Berberoğlu, “Judgmental and statistical DIF analyses of the PISA-2003 mathematics literacy items,” International Journal of Testing, vol. 9, pp. 108–121, 2009.
  48. C. Jeanrie and R. Bertrand, “Translating tests with the international test commission's guidelines: keeping validity in mind,” European Journal of Psychological Assessment, vol. 15, no. 3, pp. 277–283, 1999.
  49. E. Nida and C. Taber, The Theory and Practice of Translation, Brill, Leiden, The Netherlands, 1969.
  50. O. Behling and K. Law, Translating Questionnaires and Other Research Instruments: Problems and Solutions, Sage, Thousand Oaks, Calif, USA, 2000, (Sage University Papers Series on Quantitative Applications in the Social Sciences, no. 07–133).
  51. G. Bonnet, F. Daems, C. de Clopper et al., Culturally Balanced Assessment of Reading (c-bar). A European Project, European Network of Policy Makers for the Evaluation of Education Systems, Paris, France, 2003.