Research Article | Open Access

Abbas Taghizade, Javad Hatami, Omid Noroozi, Mohammadreza Farrokhnia, Alireza Hassanzadeh, "Fostering Learners’ Perceived Presence and High-Level Learning Outcomes in Online Learning Environments", Education Research International, vol. 2020, Article ID 6026231, 9 pages, 2020.

Fostering Learners’ Perceived Presence and High-Level Learning Outcomes in Online Learning Environments

Academic Editor: José Carlos Núñez
Received: 31 Oct 2019
Revised: 29 Jan 2020
Accepted: 01 May 2020
Published: 01 Jul 2020


This study investigated the effects of a presence-enriched teaching model on learners’ perceived presence and high-level learning outcomes in online learning environments. The study was conducted at an Iranian state university with 52 higher education students majoring in electronic IT management, who were randomly divided into an experimental and a control condition. The research instruments included a rubric to measure learners’ perceived presence and a researcher-made survey to measure learners’ high-level learning outcomes. The results showed that the frequency of produced semantic units across the presence types (cognitive, social, and teaching presence) was significantly higher for students in the experimental condition than for those in the control condition. In addition, students in the experimental condition showed more progress on the posttest in terms of their high-level learning outcomes compared to students in the control condition.

1. Introduction

With the advancement of information technologies and their application in educational contexts [1], it has become possible to create flexible learning environments free of time and space restrictions [2]. Such flexible environments enable learners to access different learning materials, share their ideas, discuss with other participants, and co-construct knowledge [3–6]. Many studies have also indicated the potential of online learning environments to engage learners actively in complex, high-level learning tasks [5, 7–9].

Despite the extensive use of online learning environments for various courses in higher education, there is evidence that such courses have failed to meet learners’ needs or provide satisfying experiences [10]. In this regard, scholars emphasize the importance of interaction (among students and between students and the teacher) for high-quality online learning [11–13]. Moreover, attaining higher-order learning outcomes and enhancing the ability to transfer learning to other situations can best be achieved through a community of inquiry, which includes various combinations of interaction among content, teachers, and students [14–16].

Garrison et al. [17] constructed a comprehensive conceptual framework designed to capture the educational dynamic of higher education online learning environments. According to the Community of Inquiry (CoI) framework, the learning experiences are formed through the interaction of cognitive presence, social presence, and teaching presence [18]. Garrison et al. [17] claim that the shared nature of cognitive, social, and teaching presence leads to the creation of an inquiry community that provides an enriched cooperative and shared learning experience for learners. Cognitive presence is defined in terms of a cycle of practical inquiry where participants move deliberately from understanding the problem or issue through exploration, integration, and resolution [19]. Cognitive presence can also play a full mediator role between teaching and social presence towards the end of an online learning experience for adult learners [20]. Garrison et al. [17] define social presence as the learners’ ability to recognize the learning community, the feeling of belonging to it, and purposeful communication in a learning community. Teaching presence is defined as planning, facilitating, and directing social and cognitive processes to realize predicted results according to the learners’ capabilities and needs [21].

Therefore, the CoI framework recognizes the importance of the learning environment in shaping and facilitating the educational experience [19]. Research has shown that in computer-supported collaborative learning (CSCL) environments, group activities are facilitated by the ease of communication and coordination among group members [22–24]. Moreover, the quality of social interactions among students and between students and teachers can also improve, as these environments make interactions among users more visible and thereby improve mutual understanding [25]. Within a collaborative, technology-supported interactive process, learners are better able to construct meaningful knowledge at a social level [4, 26]. CSCL environments have also been shown to increase learners’ cognitive performance [27].

On the other hand, Kim et al. [28] claimed that many instructors who are used to lecturing may be unable to implement appropriate instructional approaches to support learning in online environments. This claim is in line with the findings of Reiser and Salisbury [29], who argue that the key to developing thinking ability and high-level learning in any learning environment lies in suitable instructional strategies that encourage and support learners’ interpretations and investigations. Such strategies enable learners to reveal their hidden cognitive processes, engage in shared intellectual work centered on cooperation, and change their views on the subject during knowledge construction. Indeed, the nature of the assignments and the instructional directions provided play a vital role in engaging students in a high-level learning process [30].

As a result, in this study, adopting the CoI as a theoretical framework, we designed and developed an online collaborative learning course by elaborating an explicit instructional approach for the different modes of presence. We then investigated the effects of the designed online course on higher education students’ perceived presence and their high-level learning outcomes by answering the following questions:

Research question 1. To what extent does a teaching method enriched with presence affect students’ high-level learning outcomes in terms of analysis, evaluation, and synthesis?

Research question 2. To what extent does a teaching method enriched with presence affect students’ perceived cognitive, social, and teaching presence?

2. Materials and Methods

2.1. Materials
2.1.1. Online Learning Environment

A Learning Management System (LMS) was used to present the course and instructional material. The LMS supported synchronous communication between the instructor and learners in the form of voice, video, and text and included synchronous and asynchronous forums with capabilities for public and private discussions. All notices about class start times and exams, as well as learning resources such as related articles, PowerPoint files, and recorded sessions, were contained in this learning environment.

2.1.2. High-Level Learning Outcomes Survey

A survey consisting of 12 open-ended questions was developed to assess students’ high-level learning outcomes. Considering the nature of the course, a managerial situation was chosen as the context for the survey. A panel (five experts in the field of IT management and two experts in the field of education) worked in a brainstorming session to generate questions covering the different high-level learning outcome levels, i.e., analysis, evaluation, and synthesis. The brainstorming session resulted in 17 questions related to a problematic managerial situation. To detect possible overlap, three other experts (two IT experts and one educational expert) were asked to review the questions. As a result, five questions were identified as redundant and discarded. The final survey comprised 12 questions, in which learners had to analyze a managerial situation (4 questions), provide a strategy to solve a managerial problem (4 questions), and assess the available solutions and choose the best one to solve a managerial problem (4 questions). Examples of the questions are shown in Table 1.


Q3. The Managing Director of X Insurance Company believes that a new era of strategic play has emerged in the competitive world of companies, and that today’s companies need flexibility to succeed in this competitive environment. Therefore, as Vice President of information technology, you are asked to propose a solution that enables the company to respond quickly to customer preferences, changes in industry norms due to newcomers, and unforeseen barriers to action. Please state your solution and the reasoning behind it.


Q6. Suppose you are setting up an IT consulting firm. Considering the issues discussed in this lesson, analyze the impact that information technology will have on your organization.


Q11. The CEO of Bank M, in the process of developing a digital banking roadmap, asks the IT unit manager to evaluate the bank’s digital reality and current position and design options for the bank’s future digital business model. How would you do it if you were the IT manager of the bank?

To assess content validity quantitatively, the content validity ratio (CVR) method was used. Nine additional IT experts were asked to rate each question on a three-point ordinal scale: 3 = essential, 2 = useful but not essential, and 1 = not necessary. After the expert ratings were collected, as suggested by Lawshe [31], questions with a CVR higher than 0.78 were retained and the remaining questions were discarded. The results showed that all questions had a CVR higher than 0.78. Furthermore, the content validity of the final survey was evaluated based on the magnitude of the content validity index (CVI), which reflects the degree of agreement among the panelists [32]. For a panel of nine experts, a CVI greater than 0.80 denotes a high level of agreement [33]. Conversely, a CVI below 0.80 suggests that the items on the instrument do not adequately address the thematic domains being explored, raising issues of objectivity and appropriateness [32]. The CVI evaluation showed a high level of agreement among the nine IT experts on the 12-question survey (CVI = 0.84).
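The 0.78 threshold follows Lawshe’s CVR formula, CVR = (nₑ − N/2)/(N/2), where nₑ is the number of panelists rating an item “essential” and N is the panel size. A minimal sketch with hypothetical ratings (not the study’s data):

```python
# Lawshe's content validity ratio for one item:
# CVR = (n_essential - N/2) / (N/2), where N is the panel size.
def content_validity_ratio(n_essential, n_panelists):
    half = n_panelists / 2
    return (n_essential - half) / half

# With a 9-member panel, Lawshe's critical value is 0.78 (as used in the study):
print(content_validity_ratio(9, 9))            # all 9 rate "essential" -> 1.0 (retained)
print(round(content_validity_ratio(7, 9), 2))  # 7 of 9 -> 0.56, below 0.78 (discarded)
```

Because 8 of 9 “essential” ratings give CVR ≈ 0.78, retaining items above 0.78 effectively requires near-unanimous panel agreement.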

Finally, to determine the reliability of the survey, interrater reliability was assessed. Six learners were randomly selected to answer the survey, four experts scored the answer sheets, and Cohen’s kappa coefficient was high (κ = 0.86), indicating strong agreement between the raters.
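Cohen’s kappa corrects observed agreement for the agreement expected by chance: κ = (pₒ − pₑ)/(1 − pₑ). A pure-Python sketch with hypothetical ratings (Cohen’s kappa is pairwise; with four raters, as in the study, it is typically averaged over rater pairs or replaced by a multi-rater statistic such as Fleiss’ kappa):

```python
# Cohen's kappa for two raters assigning categorical codes to the same items.
def cohens_kappa(rater1, rater2):
    assert len(rater1) == len(rater2)
    n = len(rater1)
    labels = set(rater1) | set(rater2)
    # Observed agreement: proportion of items coded identically.
    p_observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement: product of each rater's marginal label proportions.
    p_expected = sum((rater1.count(l) / n) * (rater2.count(l) / n) for l in labels)
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical scores on six answer sheets:
r1 = ["high", "high", "low", "low", "high", "low"]
r2 = ["high", "high", "low", "high", "high", "low"]
print(round(cohens_kappa(r1, r2), 2))  # -> 0.67
```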

2.2. Participants

The participants were 52 MSc students (6 female and 46 male), one instructor, and four teaching assistants. The instructor was an associate professor with 25 years of teaching experience, and the teaching assistants were Ph.D. students in the field of management. The students were already enrolled in the course but were randomly assigned to the experimental (N = 26) and control (N = 26) conditions. All students were in the second semester of their MSc program, with an average age of 24 years for the male students and 29 years for the female students (Table 2). The students had already passed, on average, 15 courses, and 80% of them rated their perceived capability to use technology as intermediate on a 5-point Likert scale. Although the sample consists mostly of male students, the participants reflect the gender distribution of undergraduate students at the university under study.



Note. N = number. M = mean. SD = standard deviation.
2.3. Design and Procedure
2.3.1. Instructional Design of the Course

Arbaugh et al. [34] developed and validated a CoI survey instrument consisting of 34 agreed-upon and statistically validated items that operationalize the concepts of the CoI framework. This survey has been used in previous research both to guide design elements ahead of time and to evaluate their success in supporting the development of an online community of inquiry once implemented [35]. Indeed, the CoI survey items provide insight into the practice-based requirements of each presence [19]. Based on this survey, the following instructional approach was used to facilitate and support each presence mode in this study.

Cognitive Presence. Creating naturally confusing, ambiguous, multiple-solution problems in the form of real examples related to learners’ experience; persuading learners to search for information in various sources; persuading learners to share their suggestions and previous experiences; persuading learners to make connections between the information obtained; persuading learners to keep asking questions, build knowledge on others’ ideas, and justify suggested propositions; persuading learners to defend and test new ideas or solutions; encouraging learners to reflect on the results of newly obtained ideas; asking learners to provide evidence to strengthen a suggested claim when confronting contradictory evidence; and determining the role of learners in each group prior to the online discussion (group leader, information collector, discussion organizer, and analyzer).

Social Presence. Persuading learners to use various paralanguage cues such as signs, capital letters, emoticons, and avatars to enhance emotional and interpersonal connectedness; encouraging learners to share voice messages, images, and videos; teaching social skills and the rules of connectedness prior to the course; communicating the consequences of learning to enhance learners’ motivation; acknowledging and appreciating others’ participation to create an open relationship; respecting and appreciating learners as friends; and calling learners by their first names and using “we” when addressing the group as an indication of each individual’s membership in the group.

Teaching Presence. Setting goals; selecting content and learning activities; organizing working groups at the beginning of the course; supervising learners’ purposeful participation and reflection needs; recognizing and directing needs and providing timely information; distributing the teacher’s duties and roles among learners; recognizing others’ misunderstandings; synthesizing knowledge from different sources; summarizing the discussion after each session; providing links and structural cues to direct and guide learners; paving the way for learners to access resources and related databases; making the PowerPoint presentations and lecture notes available in the LMS; providing guidance on how to use the media effectively; reviewing and commenting on learners’ answers; preventing individual learners from dominating discussions; stimulating inactive learners; giving sufficient time for tasks; setting deadlines for tasks; and giving prompt answers to learners’ questions and problems.

2.4. Study Design

The study consisted of three phases, described below.

2.4.1. Preparation and Personal Study

Prior to the beginning of the online class, to address ethical considerations, the participants were fully informed about the main objectives of the research, and their consent to participate was obtained and recorded. Thereafter, the learners were assigned to four groups of five members and one group of six members, and a group head was designated for each group. Then, three days before each session, a video containing a managerial situation was sent through the LMS, and a problem about it was posed. The criteria for formulating problems were their complexity, attractiveness, and similarity to learners’ real situations, framed as a story. The solutions were also multifaceted, requiring data collection from different perspectives (financial, managerial, and information technology). The sources related to the problem, including relevant articles, book chapters, and links to related websites, were sent along with the problem. Video clips provided at this stage can improve learners’ readiness and problem-solving effectiveness, as they stimulate learners to search for and solve related problems through repeated viewing [36]. Furthermore, before the start of the online sessions, another in-person meeting was held in which the learners and instructor became acquainted and the rules of participation and online discussion were discussed.

2.4.2. Online Group Discussion

In this phase, the instructor, teaching assistants, and learners communicated simultaneously in the online learning environment. In each session, the instructor lectured on the theories and content relevant to the problem (30 minutes); the learners were then asked to discuss the problem in their assigned groups in chat rooms (40 minutes); finally, the instructor and assistants supervised and guided the learners and encouraged them to discuss within their groups. The synchronous online discussion allowed learners to share and reflect on their ideas without time and space limitations, helping students learn from multiple perspectives and create knowledge through interactive dialogue [37]. In the same vein, Branon and Essex [38] claim that prompt interaction through synchronous communication can facilitate feedback, knowledge sharing, brainstorming, and decision-making, which are important abilities in problem-solving situations.

2.4.3. Feedbacking and Providing Guidance

After the online group discussions, the group heads presented their groups’ answers to the questions orally and in writing (5 minutes), and the groups were able to change or complete their answers based on the new insights gained through the instructor’s, assistants’, and other groups’ feedback. It has been shown that whole-class discussions and oral peer assessment enable learners to enhance their reflection on new situations as well as their self-assessment and awareness [39]. In addition, the instructor’s feedback can help learners clarify objectives and increase their commitment and learning effort [40].

It must be noted that in the control group, the instructor presented the topic in each session for 60 minutes and posted a related problem, which the learners discussed as a group. The instructor guided the group when necessary and provided the answer at the end. After the course ended, students in both conditions took a posttest measuring high-level learning outcomes.

2.5. Analysis
2.5.1. Discussions Analysis Checklist

The discussions recorded in the designed online learning environment (learner-learner and educator-learner) were coded using qualitative content analysis based on the protocols for social presence [41], teaching presence [42], and cognitive presence [43] (see Supplementary Appendix 1 for examples). The coding of the online discussion contents was carried out by two coders, who were advised to treat any meaningful piece of a message that could be placed in a category as a code. This method of coding is called analysis by semantic unit or thematic unit [41]. To ensure the reliability of the coding process, the coders received extensive training on applying the coding scheme and were given the opportunity to practice with samples of online discussion content for each presence type. The interrater reliability of coding the messages in terms of the different presence types was sufficiently high for both the experimental and control groups (Table 3).

Group | Cognitive presence | Social presence | Teaching presence


2.6. Data Analysis Method

To analyze the data, descriptive statistics (mean and standard deviation) and inferential statistics, i.e., multivariate analysis of covariance (MANCOVA) and chi-squared tests, were used.

3. Results and Discussion

3.1. Descriptive Information for the Dependent Variables

In Table 4, high-level learning outcomes of students in both the experimental group and control group conditions in pretest and posttest are presented.

Variables | Experimental group | Control group


M = mean. SD = standard deviation.
3.1.1. Research Question 1

The first research question concerned the effect of the designed online course on students’ high-level learning outcomes. The MANCOVA revealed an overall effect of condition on the students’ high-level learning outcomes (Wilks’ λ = .412, F(3, 47) = 22.35, η2 = .58); the two conditions differed significantly on at least one of the three dependent variables. The Bonferroni post hoc test was used to locate the differences between the experimental and control conditions in terms of high-level learning outcomes. Students’ learning at the analysis level on the posttest was significantly higher in the experimental condition (M = 19.23, SD = 0.91) than in the control condition (M = 16.29, SD = 1.12). Likewise, students’ learning in the experimental condition at the evaluation (M = 17.91, SD = 0.85) and synthesis (M = 17.16, SD = 0.78) levels was significantly higher than in the control condition at the evaluation (M = 14.52, SD = 0.92) and synthesis (M = 14.86, SD = 0.79) levels.
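The reported effect size is consistent with the common multivariate convention η² = 1 − Λ (the authors do not state their formula, so this is an assumption on our part):

```python
# Multivariate effect size from Wilks' lambda under one common convention:
# eta^2 = 1 - lambda (valid here since the hypothesis has a single df: two groups).
def multivariate_eta_squared(wilks_lambda):
    return 1.0 - wilks_lambda

# Reported Wilks' lambda = .412:
print(round(multivariate_eta_squared(0.412), 3))  # -> 0.588, matching the reported .58
```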

3.1.2. Research Question 2

This research question concerned the effect of the designed presence-enriched online course on students’ perceived presence. To answer it, 1588 semantic units were coded. The frequencies and percentages of semantic units obtained from the online discussions for each presence type and its categories in both groups are shown in Table 5.

Categories of cognitive presence: Triggering event | Exploration | Integration | Resolution | Total
Categories of social presence: Expressing emotion | Open relationship | Group coherence | Total
Categories of teaching presence: Planning and organizing | Facilitating dialogue | Directing teaching | Total


The chi-squared tests for the presence types and their categories showed that the frequencies of produced semantic units were significantly higher for students in the experimental condition than in the control condition for the cognitive (χ2(7, N = 52) = 12.20), social (χ2(5, N = 52) = 6.19), and teaching presence categories (χ2(5, N = 52) = 6.42).
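The Pearson chi-squared statistic compares observed category frequencies with those expected if the two conditions produced semantic units in the same proportions. A minimal pure-Python sketch with hypothetical frequencies (not the study’s counts):

```python
# Pearson chi-squared statistic for a contingency table
# (rows = conditions, columns = presence categories).
def chi_squared_statistic(table):
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand_total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count under independence of row and column.
            expected = row_totals[i] * col_totals[j] / grand_total
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical semantic-unit counts for two conditions across two categories;
# degrees of freedom = (rows - 1) * (columns - 1).
table = [[10, 20],
         [20, 10]]
print(round(chi_squared_statistic(table), 2))  # -> 6.67, df = 1
```

The statistic is then compared against the chi-squared distribution with the appropriate degrees of freedom (e.g., via `scipy.stats.chi2_contingency`) to obtain a p-value.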

4. Discussion

4.1. High-Level Learning Outcomes

In line with previous results [44, 45], the findings of this study corroborate the positive effect of the proposed CoI-based instructional approach on students’ high-level learning outcomes (analysis, evaluation, and synthesis). As the presence-enriched teaching model is derived from the constructivist approach, it can be argued that the model improves high-level learning outcomes. According to Chiu et al. [46], constructivist teaching strategies can improve high-level thinking skills, especially critical thinking, which the instructor should foster by creating enriched learning environments for learners.

4.2. Students’ Perceived Presence

The results of the content analysis showed that the frequency of produced semantic units across the categories of the presence types (cognitive, social, and teaching) was significantly higher for students in the experimental condition than for those in the control condition. The results obtained for cognitive presence are consistent with previous studies [44, 47].

Regarding cognitive presence, the highest frequency occurred in the exploration category, as also reported in previous studies [18, 35]. The integration category had the next-highest frequency, somewhat higher than in previous research [48], showing that learners were able to connect and integrate various ideas and formulate possible solutions to the problems. The resolution category had the lowest frequency among the cognitive presence categories. As Akyol and Garrison [49] have indicated, “Limitations of time is an obstacle to reach the resolution category because the learners do not have enough time to share the results of their solutions.” Likewise, Garrison et al. [50] argued that reaching the resolution category requires opportunities and clear expectations for applying the created knowledge. In other words, because many learners had no working experience as IT managers, their discussions stopped at the integration category, were not connected to real working environments, and were limited to the knowledge they had acquired about the IT manager role. Thus, allocating more time for discussions and presenting more tangible managerial cases may help reinforce the resolution category of cognitive presence.

According to Garrison and Arbaugh [51], social presence becomes more prevalent as an online course progresses. The analysis of social presence in this study confirms this argument: the frequency of messages related to social presence elements increased over time in both groups, with the highest frequency related to group coherence. However, this contradicts the results of Swan and Shea [52], who showed that indices of group coherence decrease as a course progresses. The analysis of teaching presence likewise shows variation in the frequency of its elements across the experimental and control conditions. In both conditions, the designing and organizing element was very high at the beginning but declined as the sessions progressed and was replaced by the elements of facilitating discourse and directing instruction, which accords well with the principles of constructivist approaches.

5. Conclusions and Limitations

The general conclusion that can be drawn from this study is that learners’ perceived presence and high-level learning outcomes can be fostered through instructional design enriched with the different presence types in online learning environments. Cognitive presence, which has its roots in critical thinking, provides a hierarchical framework for assessing learners’ thinking processes and their ability to reach enriched levels of learning [53]. It also enables conceptualization, assessment, and differentiation of learners’ changing levels of critical thinking. To establish high-level learning in an interactive online context, online discussions should demand learners’ cognitive participation to integrate, combine, and assess the ideas under discussion [54]. Achieving this goal requires strategies that allow learners to establish a community of inquiry through which they can participate in meaningful critical discussion, which in turn necessitates cognitive presence.

Social presence, in turn, reflects the social dynamics and the quality of relationships among learners. Online learning demands that learners share their personal experiences and views, which requires a sense of connectedness, respect, and trust. According to Weidlich and Bastiaens [55], fostering social presence in an online learning environment means creating an atmosphere that supports and encourages questioning, skepticism, and the offering of more explanatory ideas. This can, in turn, increase learners’ motivation and satisfaction and ultimately improve their academic achievement and learning [56, 57].

Moreover, while interactions between learners in virtual learning environments are necessary, they do not guarantee effective online learning [18]. Clear parameters must be defined to give these interactions a certain direction; this is the necessity of teaching presence. Experimental studies support this view with evidence of a strong relationship between teaching presence quality, learning, and learner satisfaction. Shea et al. [58] suggested that using teaching presence to measure instructional effort is more useful than indices such as the total number of messages or the hours spent online. Teaching presence enables the establishment, administration, and harmonization of high-quality online instructional activities and experiences that support, improve, and direct productive inquiry communities and, ultimately, improve learners’ academic achievement and learning [59–61].

However, this study has limitations that must be taken into account. One is the small sample size, which reduces the generalizability of the results. In addition, the use of convenience sampling instead of random sampling further limits the generalizability of the results to similar settings. Future studies should therefore corroborate these results with a larger study population and assess learners’ conceptions of cooperative learning facilitated by CSCL environments [4, 5, 62], which can provide more detail on the capabilities of such environments for building a better community of inquiry.

Data Availability

The data that support the findings of this study are available from the corresponding author upon reasonable request.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this article.

Supplementary Materials

Appendix 1 includes the coding scheme used for coding the online discussions among students, together with example discussion messages for the different presence types and their categories and indices. (Supplementary Materials)

References

  1. T. Bates, “The 2017 national survey of online learning in Canadian post-secondary education: methodology and results,” International Journal of Educational Technology in Higher Education, vol. 15, no. 1, 2018.
  2. R. Martin, T. McGill, and F. Sudweeks, “Learning anywhere, anytime: student motivators for m-learning,” Journal of Information Technology Education: Research, vol. 12, no. 1, pp. 51–67, 2013.
  3. M. Farrokhnia, H. J. Pijeira-Díaz, O. Noroozi, and J. Hatami, “Computer-supported collaborative concept mapping: the effects of different instructional designs on conceptual understanding and knowledge co-construction,” Computers & Education, vol. 142, Article ID 103640, 2019.
  4. O. Noroozi and J. Hatami, “The effects of online peer feedback and epistemic beliefs on students’ argumentation-based learning,” Innovations in Education and Teaching International, vol. 56, no. 5, pp. 548–557, 2018.
  5. A. Valero Haro, O. Noroozi, H. J. A. Biemans, and M. Mulder, “The effects of an online learning environment with worked examples and peer feedback on students’ argumentative essay writing and domain-specific knowledge acquisition in the field of biotechnology,” Journal of Biological Education, vol. 53, no. 4, pp. 390–398, 2019.
  6. Ü. A. Yücel, J. Gulikers, H. Biemans et al., “Fostering oral presentation competence through a virtual reality-based task for delivering feedback,” Computers & Education, vol. 134, pp. 78–97, 2019.
  7. A. I. M. Elfeky, “The effect of personal learning environments on participants’ higher order thinking skills and satisfaction,” Innovations in Education and Teaching International, vol. 56, no. 4, pp. 505–516, 2019.
  8. S. Latifi, O. Noroozi, J. Hatami, and H. J. A. Biemans, “How does online peer feedback improve argumentative essay writing and learning?” Innovations in Education and Teaching International, pp. 1–12, 2019.
  9. O. Noroozi, A. Weinberger, H. J. A. Biemans, M. Mulder, and M. Chizari, “Argumentation-based computer-supported collaborative learning (ABCSCL): a synthesis of 15 years of research,” Educational Research Review, vol. 7, no. 2, pp. 79–106, 2012.
  10. T. Muir, N. Milthorpe, C. Stone, J. Dyment, E. Freeman, and B. Hopwood, “Chronicling engagement: students’ experience of online learning over time,” Distance Education, vol. 40, no. 2, pp. 262–277, 2019.
  11. R. Cerezo, M. Sánchez-Santillán, M. P. Paule-Ruiz, and J. Carlos Núñez, “Students’ LMS interaction patterns and their relationship with achievement: a case study in higher education,” Computers & Education, vol. 96, pp. 42–54, 2016.
  12. K. M. Y. Law, S. Geng, and T. Li, “Student enrollment, motivation, and learning performance in a blended learning environment: the mediating effects of social, teaching, and cognitive presence,” Computers & Education, vol. 136, pp. 1–12, 2019.
  13. M. Saadatmand, L. Uhlin, M. Hedberg, L. Åbjörnsson, and M. Kvarnström, “Examining learners’ interaction in an open online course through the community of inquiry framework,” Computers & Education, vol. 20, no. 1, pp. 61–79, 2017.
  14. U. Cakiroglu, “Community of inquiry in web conferencing: relationships between cognitive presence and academic achievements,” Open Praxis, vol. 11, no. 3, pp. 243–260, 2019.
  15. M. Nazir and N. Brouwer, “Community of inquiry on Facebook in a formal learning setting in higher education,” The Internet and Higher Education, vol. 9, no. 1, p. 10, 2019.
  16. C. R. Nolan-Grant, “The community of inquiry framework as learning design model: a case study in postgraduate online education,” Research in Learning Technology, vol. 27, 2019.
  17. D. R. Garrison, T. Anderson, and W. Archer, “Critical inquiry in a text-based environment: computer conferencing in higher education,” The Internet and Higher Education, vol. 2, no. 2-3, pp. 87–105, 1999.
  18. D. R. Garrison, T. Anderson, and W. Archer, “The first decade of the community of inquiry framework: a retrospective,” The Internet and Higher Education, vol. 13, no. 1-2, pp. 5–9, 2010.
  19. D. R. Garrison, E-Learning in the 21st Century: A Community of Inquiry Framework for Research and Practice, Routledge/Taylor and Francis, London, UK, 2017.
  20. K. Kozan, “A comparative structural equation modeling investigation of the relationships among teaching, cognitive, and social presence,” Online Learning, vol. 20, no. 3, pp. 210–227, 2016.
  21. D. R. Garrison, “Cognitive presence for effective asynchronous online learning: the role of reflective inquiry, self-direction, and metacognition,” in Elements of Quality Online Education: Practice and Direction, J. Bourne and J. C. Moore, Eds., pp. 29–38, The Sloan Consortium, Needham, MA, USA, 2003.
  22. A. Adefila, J. Opie, S. Ball, and P. Bluteau, “Students’ engagement and learning experiences using virtual patient simulation in a computer supported collaborative learning environment,” Innovations in Education and Teaching International, vol. 57, no. 1, pp. 50–61, 2020.
  23. H. Ghadirian, A. F. M. Ayub, A. D. Silong, K. B. Abu Bakar, and M. Hosseinzadeh, “Group awareness in computer-supported collaborative learning environments,” International Education Studies, vol. 9, no. 2, p. 120, 2016.
  24. O. Noroozi, H. J. A. Biemans, A. Weinberger, M. Mulder, and M. Chizari, “Scripting for construction of a transactive memory system in multidisciplinary CSCL environments,” Learning and Instruction, vol. 25, no. 2, pp. 1–12, 2013.
  25. Y.-C. Hsu and Y.-M. Shiue, “Exploring the influence of using collaborative tools on the community of inquiry in an interdisciplinary project-based learning context,” Eurasia Journal of Mathematics, Science and Technology Education, vol. 14, no. 3, pp. 933–945, 2018.
  26. K. Kreijns, P. A. Kirschner, and W. Jochems, “Identifying the pitfalls for social interaction in computer-supported collaborative learning environments: a review of the research,” Computers in Human Behavior, vol. 19, no. 3, pp. 335–353, 2003.
  27. Ü. A. I. Yücel and Y. K. Usluel, “Knowledge building and the quantity, content and quality of the interaction, and participation of students in an online collaborative learning environment,” Computers & Education, vol. 97, pp. 31–48, 2016.
  28. M. K. Kim, S. M. Kim, O. Khera, and J. Getman, “The experience of three flipped classrooms in an urban university: an exploration of design principles,” The Internet and Higher Education, vol. 22, pp. 37–50, 2014.
  29. R. Reiser and D. Salisbury, “Instructional technology and public education in the United States: the next decade,” in Instructional Technology: Past, Present, and Future, G. Anglin, Ed., pp. 254–262, Libraries Unlimited, Englewood, CO, USA, 1995.
  30. P. Shea and T. Bidjerano, “Measures of quality in online education: an investigation of the community of inquiry model and the net generation,” Journal of Educational Computing Research, vol. 39, no. 4, pp. 339–361, 2008.
  31. C. H. Lawshe, “A quantitative approach to content validity,” Personnel Psychology, vol. 28, no. 4, pp. 563–575, 1975.
  32. M. R. Lynn, “Determination and quantification of content validity,” Nursing Research, vol. 35, no. 6, pp. 382–386, 1986.
  33. M. S. B. Yusoff, “ABC of content validation and content validity index calculation,” Education in Medicine Journal, vol. 11, no. 2, pp. 49–54, 2019.
  34. J. B. Arbaugh, M. Cleveland-Innes, S. R. Diaz et al., “Developing a community of inquiry instrument: testing a measure of the community of inquiry framework using a multi-institutional sample,” The Internet and Higher Education, vol. 11, no. 3-4, pp. 133–136, 2008.
  35. P. Shea and T. Bidjerano, “Community of inquiry as a theoretical framework to foster “epistemic engagement” and “cognitive presence” in online education,” Computers & Education, vol. 52, no. 3, pp. 543–553, 2009.
  36. A. A. Tawfik and C. Lilly, “Using a flipped classroom approach to support problem-based learning,” Computers & Education, vol. 20, no. 3, pp. 299–315, 2015.
  37. L. Lipponen, “Exploring foundations for computer-supported collaborative learning,” in Foundations for a CSCL Community, pp. 72–81, International Society of the Learning Sciences, Boulder, CO, USA, 2002.
  38. R. F. Branon and C. Essex, “Synchronous and asynchronous communication tools in distance education,” TechTrends, vol. 45, no. 1, p. 36, 2001.
  39. K. J. Topping, “Peer assessment,” The Internet and Higher Education, vol. 48, no. 1, pp. 20–27, 2009.
  40. J. Hattie and H. Timperley, “The power of feedback,” Review of Educational Research, vol. 77, no. 1, pp. 81–112, 2007.
  41. L. Rourke, T. Anderson, D. R. Garrison, and W. Archer, “Assessing social presence in asynchronous text-based computer conferencing,” The Journal of Distance Education, vol. 14, no. 2, pp. 50–71, 1999.
  42. T. Anderson, L. Rourke, R. Garrison, and W. Archer, “Assessing teaching presence in a computer conferencing context,” Online Learning, vol. 5, no. 2, 2019.
  43. C. L. Park, “Replicating the use of a cognitive presence measurement tool,” Journal of Interactive Online Learning, vol. 8, no. 2, pp. 140–155, 2009.
  44. H. Chen and C. Chang, “Integrating the SOP2 model into the flipped classroom to foster cognitive presence and learning achievements,” Educational Technology & Society, vol. 20, no. 1, pp. 274–291, 2017.
  45. L. E. Herrera Díaz and D. González Miy, “Developing the oral skill in online English courses framed by the community of inquiry,” PROFILE Issues in Teachers’ Professional Development, vol. 19, no. 1, p. 73, 2017.
  46. C.-M. Chiu, S.-Y. Sun, P.-C. Sun, and T. L. Ju, “An empirical analysis of the antecedents of web-based learning continuance,” Computers & Education, vol. 49, no. 4, pp. 1224–1245, 2007.
  47. A. Darabi, M. C. Arrastia, D. W. Nelson, T. Cornille, and X. Liang, “Cognitive presence in asynchronous online learning: a comparison of four discussion strategies,” Journal of Computer Assisted Learning, vol. 27, no. 3, pp. 216–227, 2011.
  48. H. Kanuka, L. Rourke, and E. Laflamme, “The influence of instructional methods on the quality of online discussion,” British Journal of Educational Technology, vol. 38, no. 2, pp. 260–271, 2007.
  49. Z. Akyol and D. R. Garrison, “Understanding cognitive presence in an online and blended community of inquiry: assessing outcomes and processes for deep approaches to learning,” British Journal of Educational Technology, vol. 42, no. 2, pp. 233–250, 2011.
  50. D. R. Garrison, T. Anderson, and W. Archer, “Critical thinking, cognitive presence, and computer conferencing in distance education,” American Journal of Distance Education, vol. 15, no. 1, pp. 7–23, 2001.
  51. D. R. Garrison and J. B. Arbaugh, “Researching the community of inquiry framework: review, issues, and future directions,” The Internet and Higher Education, vol. 10, no. 3, pp. 157–172, 2007.
  52. K. Swan and P. Shea, “The development of virtual learning communities,” in Asynchronous Learning Networks: The Research Frontier, S. R. Hiltz and R. Goldman, Eds., pp. 239–260, Hampton Press, New York, NY, USA, 2005.
  53. J. A. Maddrell, G. R. Morrison, and G. S. Watson, “Presence and learning in a community of inquiry,” Distance Education, vol. 38, no. 2, pp. 245–258, 2017.
  54. A. Sadaf and L. Olesova, “Enhancing cognitive presence in online case discussions with questions based on the practical inquiry model,” American Journal of Distance Education, vol. 31, no. 1-2, pp. 56–69, 2017.
  55. J. Weidlich and T. J. Bastiaens, “Explaining social presence and the quality of online learning with the SIPS model,” Computers in Human Behavior, vol. 72, pp. 479–487, 2017.
  56. M. B. Horzum, “Interaction, structure, social presence, and satisfaction in online learning,” Eurasia Journal of Mathematics, Science and Technology Education, vol. 11, no. 3, pp. 505–512, 2017.
  57. J. C. Richardson, Y. Maeda, J. Lv, and S. Caskurlu, “Social presence in relation to students’ satisfaction and learning in the online environment: a meta-analysis,” Computers in Human Behavior, vol. 71, pp. 402–417, 2017.
  58. P. Shea, S. Hayes, J. Vickers et al., “A reexamination of the community of inquiry framework: social network and content analysis,” The Internet and Higher Education, vol. 13, no. 1-2, pp. 10–21, 2010.
  59. S. Geng, K. M. Y. Law, and B. Niu, “Investigating self-directed learning and technology readiness in blended learning environment,” International Journal of Educational Technology in Higher Education, vol. 16, no. 1, 2019.
  60. C. Roddy, D. L. Amiet, J. Chung et al., “Applying best practice online learning, teaching, and support to intensive online environments: an integrative review,” Frontiers in Education, vol. 2, 2017.
  61. M. Zhu, S. C. Herring, and C. J. Bonk, “Exploring presence in online learning through three forms of computer-mediated discourse analysis,” Distance Education, vol. 40, no. 2, pp. 205–225, 2019.
  62. O. Noroozi, H. Biemans, and M. Mulder, “Relations between scripted online peer feedback processes and quality of written argumentative essay,” The Internet and Higher Education, vol. 31, pp. 20–31, 2016.

Copyright © 2020 Abbas Taghizade et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
