Education Research International
Volume 2012 (2012), Article ID 702315, 11 pages
http://dx.doi.org/10.1155/2012/702315
Research Article

Online Measurement Perspectives for Students’ Strategy Use: Tool Use within a Content Management System

Centre for Instructional Psychology and Technology, Katholieke Universiteit Leuven, Dekenstraat 2, P.O. Box 3770, 3000 Leuven, Belgium

Received 6 December 2011; Revised 16 March 2012; Accepted 31 March 2012

Academic Editor: Eduardo Cascallar

Copyright © 2012 Griet Lust et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

The current study investigates whether students’ tool use in a content management system (CMS), for example, Blackboard or WebCT, relates to students’ strategy use within a CMS-supported course. In this way, the study strives to explain differences in tool use and, furthermore, to find (online) behavioral indications of students’ strategy use. Data were collected within a first-year undergraduate course, “Learning & Instruction”. Students ( ) reported their strategy use as measured through the Inventory of Learning Styles (Vermunt, 1998). Students’ tool use within the CMS was logged. K-means cluster analyses revealed four clusters characterized by different strategies and accompanying tool use. Specifically, two clusters revealed a tool-use pattern in line with the behavioral profile of students’ reported strategy use. Interestingly, two clusters revealed a tool-use pattern that contrasted with the reported strategy use. These results raise questions with respect to students’ tool perceptions and students’ calibration capacities.

1. Introduction

Content management systems (CMSs) such as Blackboard and WebCT are increasingly popular in today’s higher education, either as a means of providing online courses or as part of a blended course in which both face-to-face and CMS support is provided [1, 2]. The popularity of CMSs can be ascribed to their features, which are claimed to be beneficial for students’ learning. Specifically, CMSs provide students with a rich toolset offering various kinds of learning support, and CMSs give students control over using these tools. Since these features allow students to select learning opportunities in line with their own learning needs, it is claimed that enriched, self-regulated, and motivated learning is stimulated [3, 4].

Despite these widespread arguments in favor of CMSs, evidence on students’ tool use within CMSs indicates that this learner control in tool use cannot be taken for granted. Multiple studies revealed differences in using CMS tools (e.g., [5–9]). Moreover, in most studies these tool-use differences had significant performance effects, which implies that not every student profited from the learning opportunities CMSs provide. Clearly, this evidence demonstrates that the availability of a rich toolset does not guarantee that tools will be used as intended by the designer. In this respect, Perkins [10] and Winne [11] stress that tool use is a mindful process and not a passive reaction to instructional stimuli such as tools. Specifically, adaptive tool use, or using tools in line with one’s learning needs, presupposes that (a) students perceive the learning content and tools in a similar way as the designers/teachers, (b) students are skillful in using the tools as intended, and (c) students are motivated to spend effort and time on using these tools [10, 11]. Consequently, adaptive tool use appears to be metacognitively and motivationally governed behavior (cf. the conditions of Perkins and Winne), wherein learners adaptively regulate their tool use in line with the learning tasks. In this way, adaptive tool use can be considered a self-regulated study strategy.

Consequently, tool-use differences in current CMS research could reflect different strategy choices among students, due to differences in the underlying conditions as stated by Perkins [10] and Winne [11]. Nevertheless, current CMS research has merely observed students’ tool use, the differences therein, and how these differences impact students’ learning. The strategy-related reasons remain unclear. The current study addresses this concern. Particularly, the current contribution addresses the main hypothesis that profiles reflecting different tool and strategy use among students can be identified. Tool-use differences can be related theoretically to differences in strategy use by comparing the behavioral characteristics of different study strategies with the different tool functionalities within CMSs. The following section elaborates on these hypotheses.

2. Hypotheses: Behavioral Profiles of Strategy Use

Tools that are provided through a CMS can be categorized depending on the kind of support they provide for the learning process. Table 1 depicts the different tool categories together with their objective functionality for the learning process [12]. In general, Table 1 illustrates a broad distinction between (a) information tools, which provide information in a structured or elaborated way, (b) cognitive tools, which allow interaction with the learning content, and (c) scaffolding tools, which guide students’ learning processes. Additionally, different types of cognitive tools are distinguished depending on the object of interaction, and different types of scaffolding tools depending on the object of scaffolding.

tab1
Table 1: Tool functionalities.

Research and theory on students’ study strategies find their origin in Marton and Säljö’s [13] initial distinction between surface and deep approaches to learning. These two approaches describe qualitative differences in how students process a specific learning task. In addition to these differences in students’ processing strategies, Vermunt [14] also incorporates students’ regulation strategies in his model of students’ study strategies. According to Vermunt [15], a study strategy encompasses both regulation and processing strategies. In this respect, Vermunt’s model is applicable to the current study given that adaptive tool use, as using tools in relation to the learning needs, is assumed to be a self-regulated process [10, 11].

Based on a phenomenographic analysis of students’ study strategies, Vermunt [14] distinguished, in line with Marton and Säljö, two kinds of processing strategies: surface and deep processing strategies. Surface processing strategies are characterized by learning activities such as rote memorization and repetition and reflect a focus on recall and reproduction [13, 14]. Deep processing strategies are characterized by learning activities such as relating ideas and seeking evidence and reflect an intention to understand what is being taught [13, 14]. Regulation strategies are directed at regulating cognitive processing. In this respect, three kinds of regulation strategies were distinguished: (a) self-regulation, (b) external regulation, and (c) lack of regulation [14]. In theory, a student’s study strategy can combine any processing strategy with any regulation strategy. However, data gathered with Vermunt’s Inventory of Learning Styles [15] established four study strategies that are empirically dominant: (a) a self-regulated, deep-oriented study strategy, (b) an externally regulated, surface-oriented study strategy, (c) an unregulated study strategy, and (d) an application-directed study strategy [17–20]. The behavioral profile of each of these study strategies, as derived from the phenomenographic analysis of Vermunt [16] and as measured through the Inventory of Learning Styles [15], is used to set the hypotheses. Although the scales “external regulation” and “analyzing” as measured through the Inventory of Learning Styles [15] are rather weak (Cronbach’s alphas for regular students range between .48 and .68), they will be incorporated in the current study for reasons of completeness. However, their reliability will be checked again.

Students with an externally regulated, surface-oriented study strategy process the subject matter in a superficial way by analyzing the content for details and by memorizing and rehearsing this information [14, 15]. These students regulate their learning externally, which implies, according to Vermunt’s model [15], that they use the devices that are provided by instruction. Although this argument holds for support devices that are integrated into the content and tasks (embedded devices), it is questionable whether it holds for tools, which are nonembedded devices [21]. In the case of embedded devices, the support can hardly be avoided once a learner engages with the content and the learning tasks [21]. In the case of nonembedded devices or tools, learners have to decide when and how they will use these tools. Hence, using tools presupposes self-regulation skills [21] such as metacognitive monitoring and control [22]. Given this difference, it is expected that only self-regulated learners will use the tools as intended (cf. infra). Externally regulated learners, on the other hand, will use a limited number of tools and will use them in a superficial way. Given their focus on memorizing and rehearsing, these students will select tools that are related to the examined subject matter, such as basic information tools that structure the content that has to be learned. Furthermore, they will avoid tools that elaborate their knowledge, that is, elaborated information tools, and tools that induce higher-order learning, that is, cognitive tools.

Students with a self-regulated, deep-oriented study strategy process the subject matter in a deep manner by relating and structuring the subject matter and by critically processing the information. These students are focused on connecting different parts of the learning content and organizing these parts into an organized whole [14]. Additionally, these students aim to do something with the content material in order to process this material critically [14]. In addition to these processing strategies, these students self-regulate their learning, which implies that (a) they prepare their learning process, (b) they diagnose whether the learning activities progress in the intended direction, (c) they change the original plan if necessary, and (d) they evaluate and reflect on their learning process [14, 15]. In line with this profile, it can be expected that these students will use the available CMS tools in an adaptive way since these tools structure, elaborate on, and allow critical reflection on the subject matter. Particularly, students will use cognitive processing tools in order to organize the information into a whole and to prepare their learning. Furthermore, these students will use basic information tools since these tools allow them to orient themselves to the main structure of the subject matter. Moreover, they will also use elaborated information tools since these tools allow them to relate new information to their existing knowledge. In order to diagnose their learning process, these students will use the knowledge modeling tools frequently and intensively (longer) since they aim to assess their state-of-the-art knowledge in relation to the course objectives. In order to critically process the information and to monitor their learning, the communication tools will be used actively, that is, by posting messages, since this allows students to reflect on the subject matter, to monitor their understanding, and to ask for help if necessary.

Students with an undirected study strategy lack regulation of their learning processes. These students have difficulties regulating their learning themselves. They also have difficulties with external regulation of their learning; that is, although they are strongly focused on the available learning support, it remains unclear to them how these devices can support their learning. Furthermore, these students are strongly focused on factual information; they avoid questions and assignments that deal with comprehension and application [14]. In line with this study strategy, it can be expected that these students will use most of the available tools within a course. However, they will use them in a superficial way, in contrast to students with a self-regulated, deep-oriented study strategy. Specifically, they will use the knowledge modeling tools only briefly since they are mainly focused on the questions and the correct answers. Moreover, they will avoid knowledge modeling tools that deal with comprehending and applying the course content since these tools are less straightforward. In addition, these students will use the communication tools passively; that is, they will merely read messages. In this way, the discussion board will not be used to reflect on and discuss the course content. Furthermore, they will use the conceptual scaffolding tools frequently and intensively since these tools provide guidance on what to consider.

Finally, an application-directed study strategy is focused on processing the subject matter in a concrete way. Specifically, these students seek the relevance of the subject matter in the outside world. Furthermore, these students reflect on the subject matter in terms of personal relevance [14, 15]. Given this processing strategy, it can be expected that they will use information tools that elaborate on the course content with practical examples, since these tools allow learners to make connections with the outside world. Additionally, it can be expected that these students will use the communication tools actively by posting content-related messages. This tool use allows learners to reflect on the course content in terms of personal relevance.

3. Method

3.1. Participants

Participants were first-year Educational Sciences undergraduates ( ) at the K.U.Leuven: 93% women and 7% men. Most of the students (63%) were 18 years old. The distributions in gender and age represent the demographics of the whole cohort and are typical for Flemish Educational Sciences courses.

3.2. The Blended Course
3.2.1. Content

At the K.U.Leuven, “Learning and Instruction” is a first-year bachelor course in the department of Educational Sciences. It is a theoretical course that gives an introduction to the field of educational sciences. The course content consists of two parts: a theoretical introduction to the main concepts and an expansion of these concepts. The latter part consists of different scientific contributions that relate to the main concepts.

3.2.2. Learning Tasks

Students’ learning was assessed by an exam and an assignment. The exam consisted of three parts that measured different aspects of students’ learning. The first part, “the factual items,” contained items in which students had to reproduce their knowledge of the course content. The second part, “the comprehension items,” consisted of items that required students to relate different aspects of the course content to each other. The last part, “the application items,” required students to interpret specific situations in terms of the course content. The assignment consisted of an educational proposition; students had to apply their content knowledge to build critical arguments for their agreement or disagreement with the proposition.

3.2.3. Tools

In addition to the syllabus, a series of lectures, a team of support staff, and a CMS were at students’ disposal. The lectures provided the course content in a structured way and hence aimed at supporting students’ information-retrieving needs. The support staff organized three learning support sessions that students could attend voluntarily. Students could also prepare (voluntarily) for the sessions by completing a preparatory assignment before each session.

The CMS was version 9 of Blackboard. Access to and use of the learning environment were under student control. A variety of learning support was available in the course. First of all, there was administrative information about the course (e.g., course info, the announcements). Second, different information tools were available (e.g., the course material outlines and the web lectures) that provided the content in a structured way and thus strove to support students’ information-retrieving needs. Third, information tools that elaborated on the course content were available (e.g., web links). They provided different content applications and hence supported students in comprehending and applying the course content. Fourth, knowledge-modeling tools were available (i.e., practice quizzes and exercises) that allowed students to reflect on their knowledge and on the course requirements. Fifth, there was an opportunity for collaboration and communication with peers, the instructor, and the course content, that is, the discussion board. The discussion board supported students in critically reflecting on and comprehending the course content. Sixth, a planning was available that revealed how knowledge and learning were structured throughout the course. Finally, there were some conceptual scaffolds (e.g., study tips, feedback on practice quizzes) that guided students’ attention and gave metacognitive feedback with respect to study strategies.

3.3. Measurement Instruments
3.3.1. Tool Use

Information on use of the digital (CMS) tools was collected through logging students’ actions in the Blackboard course. Table 2 presents the tool-use variables that were used in this study.

tab2
Table 2: Tool-use variables.
3.3.2. Study Strategies

Students’ study strategies were measured with the Inventory of Learning Styles (ILS) of Vermunt [15]. The ILS includes four domains of learning, two of which were considered relevant: processing strategies and regulation strategies (together they define a study strategy). The selected part of the instrument consisted of 50 items, all of which described learning activities. Normally, the ILS asks students to report on their usual way of learning [16]. For the study’s purpose, the instructions asked students to indicate the learning strategies adopted in the course “Learning and Instruction”. Specifically, students had to indicate the degree to which they agreed with a given strategy on a six-point Likert scale (1 = totally disagree; 6 = totally agree).
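As a minimal sketch of how such Likert items are typically aggregated into scale scores, consider the following; the item-to-scale mapping, the item names, and the responses are invented for illustration (the actual ILS scales contain more items):

```python
# Hypothetical sketch: scoring ILS-style scales from 6-point Likert items.
# The scale names, item identifiers, and responses below are invented.
scales = {
    "memorizing_rehearsing": ["item01", "item02", "item03"],
    "relating_structuring": ["item04", "item05", "item06"],
}

# One student's (invented) responses on the 1-6 agreement scale.
responses = {
    "item01": 5, "item02": 4, "item03": 6,
    "item04": 2, "item05": 3, "item06": 2,
}

def scale_scores(responses, scales):
    """Average the item responses belonging to each scale."""
    return {
        scale: sum(responses[i] for i in items) / len(items)
        for scale, items in scales.items()
    }

scores = scale_scores(responses, scales)
```

This per-student scale score is the kind of variable that enters the cluster analysis alongside the logged tool-use counts.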

3.4. Procedure

Students’ tool use was logged throughout the course “Learning & Instruction”, from the moment the course started until the final course exam. Students’ study strategies were surveyed during the third lecture of the course. At this point, students already had some experience with the course and its requirements, which allowed them to estimate their study strategies more accurately than at the very beginning of the course.

3.5. Quality of Analyses

Quality of the analyses was guaranteed by investigating the reliability of Vermunt’s questionnaire. Reliability refers to the extent to which the constructs are free from error and yield consistent results [23]. In this study, Cronbach’s alpha was used to measure the internal consistency of the different strategies. The Cronbach’s alphas for the different strategies are presented in Table 3. Results reveal that the items measuring “analyzing,” “external regulation: process,” and “external regulation: result” are not reliable, since their Cronbach’s alpha is below the threshold of .70 [23]. As indicated in the theoretical framework, these constructs also had weak alpha values in the research of Vermunt [15]. Therefore, they were excluded from further analyses.

tab3
Table 3: Reliability processing and regulation strategies.
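The internal-consistency check can be sketched with the standard Cronbach's alpha formula; this is not the authors' actual script, and the response matrix below is invented:

```python
# Sketch of Cronbach's alpha: k/(k-1) * (1 - sum(item variances)/var(totals)).
# Population variances are used; the ratio (and hence alpha) is the same
# as with sample variances since the 1/n vs 1/(n-1) factor cancels.
def cronbach_alpha(items):
    """items: list of per-item response lists (same students in each)."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[s] for item in items) for s in range(n)]
    return k / (k - 1) * (1 - sum(var(i) for i in items) / var(totals))

# Invented 4-item, 5-student response matrix.
data = [
    [3, 4, 5, 2, 4],
    [3, 5, 5, 2, 3],
    [2, 4, 4, 1, 3],
    [4, 5, 5, 2, 4],
]
alpha = cronbach_alpha(data)
threshold_ok = alpha >= 0.70  # the .70 cut-off applied in the study
```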

The validity of Vermunt’s questionnaire was addressed as part of the main analysis, that is, the cluster analysis, as described below.

3.6. Analysis

K-means cluster analysis was performed in R on the (standardized) tool-use and study-strategy variables in order to investigate whether student differences in tool and strategy use reflect distinct profiles. K-means cluster solutions with two to ten clusters were fitted using 1000 restarts (for a discussion of the use of K-means cluster analysis, see [24]). The optimal cluster solution was selected based on the scree plot, in which the number of clusters is plotted against the sum of squared residuals. This measure decreases monotonically, but at some point the decrease flattens markedly. The location of such an “elbow” was selected as the appropriate cluster solution [25].
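The procedure above can be sketched as follows. The study used R with 1000 restarts on the standardized variables; this minimal Python sketch uses invented two-dimensional data and fewer restarts, but illustrates the same idea: refit from random starts, keep the lowest sum of squared residuals (SSE), and inspect SSE against the number of clusters for an elbow.

```python
import random

random.seed(0)

def kmeans(points, k, restarts=20, iters=30):
    """K-means with random restarts; returns (best SSE, centroids)."""
    dims = range(len(points[0]))

    def dist2(p, c):  # squared Euclidean distance
        return sum((p[d] - c[d]) ** 2 for d in dims)

    best = None
    for _ in range(restarts):
        centroids = [list(p) for p in random.sample(points, k)]
        assign = [0] * len(points)
        for _ in range(iters):
            # Assign each point to its nearest centroid...
            assign = [min(range(k), key=lambda c: dist2(p, centroids[c]))
                      for p in points]
            # ...then move each centroid to the mean of its members.
            for c in range(k):
                members = [p for p, a in zip(points, assign) if a == c]
                if members:
                    centroids[c] = [sum(m[d] for m in members) / len(members)
                                    for d in dims]
        sse = sum(dist2(p, centroids[a]) for p, a in zip(points, assign))
        if best is None or sse < best[0]:
            best = (sse, centroids)
    return best

# Two well-separated invented groups of "students".
points = ([(random.gauss(0, 0.2), random.gauss(0, 0.2)) for _ in range(20)] +
          [(random.gauss(3, 0.2), random.gauss(3, 0.2)) for _ in range(20)])

# Scree values: SSE for k = 1..4; the "elbow" should appear at k = 2 here.
sse_by_k = {k: kmeans(points, k)[0] for k in range(1, 5)}
```

In practice one would plot `sse_by_k` and pick the k after which the decrease flattens, exactly as described for the scree plot above.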

The different clusters were further defined in terms of the study strategies and the tool use that they represent. In this respect, a multivariate analysis of variance with the cluster solution as independent variable and the clustered variables as dependent variables was performed. Particularly, results from the pairwise comparisons were used to define the different clusters. These cluster profiles were used to investigate the study’s hypotheses.
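The follow-up univariate tests behind such comparisons can be sketched with a one-way F statistic for a single clustered variable; this does not reproduce the study's MANOVA or its post-hoc procedure, and the group data below are invented:

```python
# Sketch of a univariate one-way ANOVA F statistic:
# the between-group / within-group variance decomposition.
def one_way_f(groups):
    """F statistic for a list of per-group value lists."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Invented access frequencies on one tool for three clusters.
clusters = [[1, 2, 1, 2], [5, 6, 5, 6], [9, 8, 9, 8]]
f_stat = one_way_f(clusters)

# Cluster means, the quantities compared pairwise to characterize clusters.
means = [sum(g) / len(g) for g in clusters]
```

A large F indicates that the clusters differ on that variable; the pairwise comparison of the cluster means then shows which clusters differ from which.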

4. Results

4.1. Descriptive Statistics

Table 4 presents the descriptive statistics of central tendency and distribution for students’ tool use. The central tendency values indicate that the tools were not used very often. Nevertheless, the standard deviations are in most cases very high, which indicates large individual differences among students. Furthermore, the data for a majority of the tool-use variables are highly positively skewed, which implies that the bulk of values lies to the left of the mean. Consequently, most students had tool-use values lower than the means. Clearly, the descriptive statistics indicate that the tools were hardly used by students.

tab4
Table 4: Tool use descriptive.
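What positive skew implies for such count variables can be illustrated with an invented right-skewed sample of tool-access counts, where a few heavy users pull the mean above the value typical for most students:

```python
# Invented right-skewed tool-access counts: most students rarely open the
# tool, one heavy user pulls the mean upward.
accesses = [0, 0, 1, 1, 1, 2, 2, 3, 4, 25]

mean = sum(accesses) / len(accesses)
median = sorted(accesses)[len(accesses) // 2]  # upper middle value (even n)
```

Here most values lie below the mean, mirroring the pattern described for the tool-use variables.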

Table 5 depicts the measures of central tendency for each processing and regulation strategy. The descriptives reveal that students on average report processing strategies such as relating and structuring, memorizing and rehearsing, and concrete processing. On average, students report lower levels of self-regulation strategies, lack-of-regulation strategies, and critical thinking strategies.

tab5
Table 5: Study strategy descriptive.
4.2. Strategy-Related Bases of Students’ Tool Use

A four-cluster solution was selected after two K-means cluster analyses. At the four-cluster solution a slight elbow could be seen in the scree plot. This elbow indicates the optimal cluster solution, wherein the ratio of the distances between the cluster centroids to the distances between the cluster members was at its maximum.

A multivariate analysis of variance with the four-cluster solution as independent variable and the clustered variables as dependent variables revealed that these four clusters had different profiles, Wilks’ λ = .03, F(78, 458) = 13.47, , η² = .70. Table 6 depicts the cluster centroids for the four clusters together with the results of the between-subjects tests (α-level .05). For some tools, the univariate tests were not significant, indicating that the clusters did not differ significantly on that particular tool. In some cases, however, for example, web lectures (timing), post-hoc comparisons still revealed some significant differences among particular clusters. In these cases, this information is given in the last column.

tab6
Table 6: Cluster profiles.

Students in cluster 1 ( ) scored significantly lower on the different processing and regulation strategies, , than the other clusters in the post-hoc tests. Students in cluster 1 did not use the discussion board and the web links in comparison with students in cluster 4, . Except for the web lectures, they did not use the other tools in comparison with students from the other clusters, .

Students in cluster 2 ( ) scored significantly higher on lack of regulation in comparison with the other clusters, . In comparison with students in cluster 1, students in cluster 2 used the course outlines, the study support, the first practice quiz, and the feedback on the first practice quiz more frequently, . Furthermore, they used the feedback on the first practice quiz longer than students in cluster 1, . In comparison with students in clusters 1 and 3, they read more messages on the discussion board and used the learning support and the second practice quiz more frequently, . Additionally, students in cluster 2 watched the web lectures for a shorter time but watched them more in a nonstop way in comparison with cluster 3, . Students in cluster 2 did not use the web links and did not post messages on the discussion board in comparison with students in cluster 4, . Finally, students in cluster 2 spent the most time on the feedback on the first practice quiz.

Students in cluster 3 ( ) scored significantly higher on relating and structuring, critical thinking, concrete processing, and self-regulation than students in clusters 1 and 2, . Furthermore, they scored significantly higher on memorizing and rehearsing than all the other clusters, . Students in cluster 3 accessed the course outlines more than students in cluster 1, . Students in cluster 3 watched the web lectures for a longer time and watched them more strategically in comparison with cluster 2, . Students in cluster 3 did not use the web links, the communication tools, and the planning in comparison with students in cluster 4, . Additionally, they accessed the two practice quizzes less frequently than students in cluster 4, . Although they used the study support and the learning support more than students in cluster 1, they did not use them as frequently as students in clusters 2 and 4, . Finally, they did not use the feedback on the practice quiz in comparison with students in clusters 2 and 4.

Students in cluster 4 ( ) scored significantly higher on critical thinking, concrete processing, and self-regulation than students in clusters 1 and 2, . Students in cluster 4 used the web links (frequency and timing), the discussion board, and the planning significantly more frequently and intensely than the other clusters, . Additionally, students in cluster 4 used the learning support, the study support, the feedback on quiz 1, and the practice quizzes more frequently than students in clusters 1 and 3, . The descriptives reveal that they used the second practice quiz longer and the feedback on quiz 1 for a shorter time than students in cluster 2.

5. Discussion

This study investigated the strategy-related bases of students’ tool use within a CMS-supported course. Following the model depicted by Perkins [10] and Winne [11], it was argued that tool use is possibly an observable trace of students’ strategy use, since it is an informed decision based on (a) students’ interpretation of the course context, (b) students’ tool-use skills, and (c) students’ motivation. The results of this study are in line with this argument: four different profiles were revealed that reflected different tool and strategy use among students.

Moreover, two clusters revealed a tool-use pattern that was associated with their reported strategies. Specifically, students in cluster 2 reported a lack of regulation strategies. According to Vermunt [14, 15], these students have difficulties interpreting what is expected from them. Therefore, they are strongly focused on the support provided by instruction, although they are not able to use it in a meaningful way. Moreover, they are directed towards factual information and look for “hints” within the support [14]. Students in cluster 2 revealed a tool-use pattern that reflected this behavioral profile. Specifically, students in cluster 2 accessed all the available tools except the web links. It is no surprise that students in cluster 2 ignored the web links, since these tools elaborated on the course content and would distract them from their focus on factual information. Although students in cluster 2 used all the other tools, they merely used them in a superficial way. Specifically, the web lectures were used briefly and mainly in a nonstop way: students merely started a web lecture and quickly stopped it, which indicates that the web lectures were not used for specific learner-related goals. Furthermore, students were mainly passive users of the discussion board; that is, they read messages without contributing to the ongoing discussion. Clearly, these students did not use the discussion board as a tool to critically reflect on and discuss the course content. Students in cluster 2 used the feedback on the first practice quiz very intensively, which reflects that they were looking for exam hints instead of reflecting on their deficits in knowledge. In contrast to the first practice quiz, these students used the second practice quiz only briefly. The fact that the second practice quiz consisted of open questions that were less straightforward could explain why these students did not put effort into using it.
Hence, although students in cluster 2 were directed towards the available tools, their superficial tool use reflects that they were not sure how to use them adequately, which reflects their lack of regulation. Consequently, students in cluster 2 can be labeled the disorganized students.

Students in cluster 4 reported concrete processing, critical thinking, and self-regulation strategies, three components of a self-regulated, deep-oriented study strategy [15]. Furthermore, these students revealed the expected tool-use pattern associated with a self-regulated, deep-oriented study strategy. Specifically, self-regulated, deep-oriented students are directed towards a deep processing of the course content by relating and structuring the different parts, by looking for practical applications, and by critically reflecting on the course content [14, 15]. Therefore, they will consult nonprescribed literature to deepen their interest, to better understand the topic, and to broaden their view. Furthermore, they will ask questions about the course content and will assess their knowledge frequently [14]. In line with this profile, students in cluster 4 accessed all the available tools and used them in a meaningful way, in contrast to cluster 2 (cf. supra). Particularly, students in cluster 4 used the web links frequently and intensively, since these tools allowed them to broaden their understanding of the course content. Students in cluster 4 also used the discussion board actively; that is, they posted content-related messages on the forum. In this way, these students discussed the course content and hence processed the material critically. In line with this critical processing, students in cluster 4 used the two practice quizzes frequently and intensively. Hence, they assessed and reflected on their knowledge and on the course objectives. As a tool for self-regulated learning, students in cluster 4 consulted the planning more frequently than the other students. In this way, they were able to maintain an overview of how the content was built up throughout the course. Finally, students in cluster 4 watched the web lectures in a strategic way, which means that they used these information tools for specific goals.
In addition to these differences with cluster 2, these students used the scaffolding tools and the course outlines. Consequently, cluster 4 reflects a group of self-regulated, deep-oriented students.

Nevertheless, the results are also partly inconsistent with the study’s hypotheses. Two other clusters were retrieved in which students’ tool use did not match their reported study strategy. Most of the students were part of cluster 1. Students in cluster 1 did not report a specific study strategy, nor did they engage in particular tool use except for the web lectures. This cluster does not reflect a clear profile in terms of tool and strategy use. Therefore, students from cluster 1 are labeled the undefined students. These undefined students neglected the available tools, which corresponds to findings from multiple studies that investigated students’ tool use in computer-based learning environments. In multiple studies, tools were not used when students had control over using them (see the review in [26]). Again, this evidence stresses that providing a toolset does not guarantee its use. Interestingly, the current study revealed that this group of no-users had no specific preference regarding their study strategies within the course “Learning and Instruction”. Our study, therefore, identifies a group of students without any preferences. This is in clear contrast with the other studies using the ILS of Vermunt (e.g., [17–20]).

Students in cluster 3 reported a memorizing and rehearsing strategy, a component of a surface-oriented study strategy, alongside a self-regulated and deep oriented study strategy, as did cluster 4. According to Vermunt’s model [15], this combination is not necessarily a problem, since self-regulated and deep oriented students can also engage in “lower” order strategies such as memorizing and rehearsing, whereas surface-oriented students are restricted to these “lower” order strategies [15]. Consequently, students from cluster 3 should also have engaged in deep-oriented strategies. Nevertheless, their tool use reflected a surface-oriented study strategy. In particular, surface-oriented students are highly selective and restricted to the examined subject matter [14]. In line with this orientation, students in cluster 3 were selective in their tool use. Moreover, they used tools with an explicit link to the face-to-face context, that is, the course outlines, the web lectures, the study support, and the learning support. Specifically, the outlines structured the face-to-face lectures and the web lectures recorded them, while the study support and the learning support were related to the face-to-face support sessions. Additionally, students in cluster 3 used the two practice quizzes, although their use was very short, possibly too short to count as meaningful. This tool use is again in line with the behavioral profile of surface-oriented students, who engage superficially in learning tasks and, in practice quizzes, mainly check whether they could answer the questions without actually doing so [14]. Hence, although students in cluster 3 reported similar study strategies as students in cluster 4, they revealed a tool-use pattern that reflected a surface-oriented study strategy despite having reported mainly a self-regulated and deep oriented one. Consequently, students from cluster 3 are labeled the inconsistent students. How can this inconsistency be explained?

A first explanation could be that the grain size of the questionnaire caused this inconsistency. The questionnaire asked students to indicate their study strategies within the course “Learning and Instruction” as a whole. Courses are, however, not one-dimensional; they include various learning tasks that can trigger different study strategies [27, 28]. In “Learning and Instruction”, students had to take a multiple-choice exam and an open-book exam and to complete an assignment for which they had to write an essay. Given these different learning tasks, it is possible that students changed their initial study strategies as a function of each task. This temporal development could explain why the inconsistent students did not reveal a tool-use pattern in line with what they initially reported: they may have shifted throughout the course from a self-regulated, deep oriented study strategy to a surface-oriented one. Empirical evidence supports this explanation; Hadwin et al. [29] found that students’ self-reports of study strategies differed as a function of the tasks within a course. Nevertheless, even if the inconsistent students changed their initial strategies throughout the course, some behavioral residue of their initial strategy should have been noticeable in the log data, particularly since log files were collected from the beginning to the end of the course. However, no such behavioral residue was found, hence this first explanation is probably not valid.
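Checking for such behavioral residue amounts to comparing tool use in an early course window against a later one. As a minimal sketch of this idea (the event format, tool names, and dates below are hypothetical illustrations, not the study’s actual log schema):

```python
from collections import Counter
from datetime import datetime

def split_tool_use(events, course_start, course_end):
    """Split timestamped tool-access events into early- and late-course
    halves and count per-tool accesses in each half, so that residue of
    an initial strategy can be compared against later behavior.

    `events` is a list of (timestamp, tool_name) pairs.
    """
    midpoint = course_start + (course_end - course_start) / 2
    early, late = Counter(), Counter()
    for ts, tool in events:
        (early if ts < midpoint else late)[tool] += 1
    return early, late

# Toy example: two web-link accesses early in the term, one quiz late.
events = [
    (datetime(2010, 10, 1), "web_links"),
    (datetime(2010, 10, 5), "web_links"),
    (datetime(2011, 1, 10), "practice_quiz"),
]
early, late = split_tool_use(
    events, datetime(2010, 9, 20), datetime(2011, 1, 20))
```

A deep oriented student who drifted towards a surface strategy would then show, for example, early web-link use that disappears in the late half; the study found no such early-half signal.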

A second explanation could be that the inconsistent students held misfitting instructional conceptions regarding the tools’ functionality in comparison with self-regulated and deep oriented students. As stated in the introduction, tool use also presupposes students’ ideas about the tools’ functionalities [10, 11]. The latter refers to the cognitive mediation paradigm, which postulates that students do not simply react to objective instructional stimuli but to interpreted stimuli [30–33]. Consequently, students’ ideas about the instructional stimuli will define the kind of behavior students reveal [34]. In line with this paradigm, multiple studies have revealed that students’ instructional ideas influence students’ learning activities in general [35, 36] and students’ tool use in particular [37–39]. Following this rationale, it is possible that the inconsistent students did not use the toolset because they did not perceive the available tools as supportive of critical thinking, of concrete processing, and of relating and structuring. Possibly, they thus used other tools to process the material in a self-regulated and deep way, for example, a social network site to discuss, Google to look for elaborated information, and graduate students to assess and reflect on their knowledge.

A final explanation relates to students’ capacity to report their strategy use within a course accurately. The inconsistent students possibly overestimated their self-regulated and deep oriented study strategies, given that their tool use reflected another study strategy. As early as the 1990s, Collopy [40] revealed that managers’ self-reported computer use did not reflect their use as tracked through log files: managers tended to report more use than they actually made. According to the researcher, this result can be generalized to all kinds of self-reports, which are characterized by a structural response bias, that is, the sensitivity of respondents to the cultural norms behind the questionnaire, which leads to socially desired responses [40]. Empirical evidence for this statement can be found in the field of educational sciences as well. In a twofold study, Winne and Jamieson-Noel [22, 41] investigated students’ calibration between their self-reports and their actual study tactics within an online learning environment. The results showed that students tended to report more study tactics than they actually used within the learning environment [22, 41], leading the researchers to conclude that students were not very accurate at calibrating their thoughts about their actions against their actual actions. A review by Dunning et al. [42] also indicated that in most studies students tend to overestimate their achievement and learning skills. This body of evidence suggests that the third explanation is highly plausible as well.
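A simple operationalization of such miscalibration is the mean signed difference between self-reported and logged tactic frequencies: positive values indicate over-reporting, the pattern described above. The sketch below assumes the two measures have been put on a comparable scale; the numbers are hypothetical illustration data, not values from this study:

```python
def calibration_bias(reported, observed):
    """Mean signed difference between self-reported and trace-derived
    tactic frequencies. Positive = over-reporting, negative =
    under-reporting, zero = perfect calibration on average.
    """
    if len(reported) != len(observed):
        raise ValueError("paired samples required")
    return sum(r - o for r, o in zip(reported, observed)) / len(reported)

# A student reports four tactics more often than the log traces show:
reported = [5, 4, 4, 3]   # self-reported frequency per tactic (e.g., Likert)
observed = [2, 1, 3, 2]   # frequency derived from log traces, rescaled
bias = calibration_bias(reported, observed)  # 2.0, i.e., over-reporting
```

More refined calibration indices (e.g., correlations between reported and traced tactic profiles) are used in the cited calibration studies; the bias score above only captures the direction and size of the discrepancy.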

The groups of undefined and inconsistent students call for more research. First of all, research is needed that replicates the current study in order to find out whether similar profiles can be retrieved in other CMS-supported courses; the existence of similar patterns would strengthen our results in terms of external validity. Secondly, future research needs to address the empirical validity of the explanations given for the groups of undefined and inconsistent users. With respect to the undefined users, future research should find out whether these students were simply unaware of their strategy use rather than lacking one. As Lompscher [43] pointed out, questionnaires measure strategies and behavior merely at the reflective rather than the action level. With respect to the inconsistent users, it was suggested in the first place that students’ ideas regarding the CMS tools could explain their tool use. Future research should validate this explanation by investigating how students’ instructional ideas about CMS tools influence tool use within CMSs. More insight here would be meaningful for instructional design, as it would provide cues for adapting CMSs to students’ misconceptions regarding the tool functionalities. In the second instance, it was suggested that inconsistent students were not accurate in reporting their strategy use. To validate this explanation empirically, further analyses of students’ response patterns within the ILS are necessary. If inconsistent students indeed overestimated their self-regulated and deep oriented study strategies, these results could trigger an interesting discussion about the value of self-reports of strategy use. In particular, one can wonder how valid such questionnaires are for investigating students’ learning behavior. In what way do these questionnaires reflect students’ behavior, or rather students’ cognition regarding their behavior [22]? Additionally, in what way is miscalibration between students’ cognition and behavior important for students’ learning effectiveness? Further research should address these questions.

The results must be interpreted in light of some limitations. The current study should mainly be read as an attempt to explore and link behavioral indicators, that is, CMS tool use, with reported processing strategies. As suggested above, the generalizability of the study is limited since it was conducted at a single institution with students of a single course. Hence, it remains unclear whether similar clusters of study strategies and tool use can be expected in other courses with other student populations. Secondly, the current contribution captured students’ study strategies and tool use as two stable constructs by measuring both at one time point. Students’ study strategies were measured at the beginning of the course, so the current data cannot show how students’ strategy reports changed throughout the course. With respect to students’ tool use, the log files provided an unobtrusive insight into which tools students used and how these tools were used, but exactly when tools were selected and used throughout the course remains unclear. A final limitation relates to the learning effects of students’ profiles: the current study did not investigate whether the different profiles in terms of tool and strategy use affected students’ learning. Extending existing CMS research, the current study focused solely on students’ tool-use differences within CMSs and on the strategy-related bases of these differences.

Despite these limitations, the current contribution has some interesting implications for instructional design and for study strategy research. For study strategy research, our results provide interesting methodological perspectives. The fact that different study strategies were reflected in distinct observable traces, that is, for the disorganized students and for the self-regulated and deep oriented students, opens possibilities for measuring students’ study strategies with behavioral measurement techniques. The benefit of such techniques is that they are unobtrusive: students’ study strategies are captured without interfering with students’ ongoing behavior. From an instructional design perspective, these results stress that it cannot be assumed that a rich toolset will automatically be used in the ways intended by its designers. Although there are theoretical arguments in favor of learner control in tool use [44, 45], this study revealed that there is a group of students, that is, the disorganized students, who are aware of the learning support but are not able to use it meaningfully. This result suggests that the degree of learner control in CMSs needs to be adapted to students’ regulative capacities. In particular, students who report a “lack of regulation” need metacognitive and procedural guidance in order to learn to use the available tools in a meaningful way. The current study revealed that a self-report questionnaire can identify such students.

Acknowledgment

This research was made possible by a grant from the National Science Foundation-Flanders (FWO), Grant G.0408.09.

References

  1. A. Y. Chan, K. Chow, and W. Jia, “A framework for evaluation of learning effectiveness in online courses,” in Proceedings of the 4th International Conference on Web-Based Learning (ICWL '05), R. W. Lau, Q. Li, and R. Cheung, Eds., pp. 383–395, Springer, 2005.
  2. N. Hoic-Bozic, V. Mornar, and I. Boticki, “A blended learning approach to course design and implementation,” IEEE Transactions on Education, vol. 52, no. 1, pp. 19–30, 2009.
  3. S. R. Malikowski, M. E. Thompson, and J. G. Theis, “A model for research into course management systems: bridging technology and learning theory,” Journal of Educational Computing Research, vol. 36, no. 2, pp. 149–173, 2007.
  4. J. W. Nutta, “Course web sites: are they worth the effort?” NEA Higher Education Advocate, no. 3, pp. 5–8, 2001.
  5. L. Hammoud, S. Love, L. Baldwin, and S. Y. Chen, “Evaluating WebCT use in relation to students' attitude and performance,” International Journal of Information and Communication Technology Education, vol. 4, no. 2, pp. 26–43, 2008.
  6. S. L. Hoskins and J. C. Van Hooff, “Motivation and ability: which students use online learning and what influence does it have on their achievement?” British Journal of Educational Technology, vol. 36, no. 2, pp. 177–192, 2005.
  7. G. Huon, B. Spehar, P. Adam, and W. Rifkin, “Resource use and academic performance among first year psychology students,” Higher Education, vol. 53, no. 1, pp. 1–27, 2007.
  8. M. Grabe and K. Christopherson, “Optional student use of online lecture resources: resource preferences, performance and lecture attendance,” Journal of Computer Assisted Learning, vol. 24, no. 1, pp. 1–10, 2008.
  9. G. Lust, M. Vandewaetere, E. Ceulemans, J. Elen, and G. Clarebout, “Tool-use in a blended undergraduate course: in search of user profiles,” Computers and Education, vol. 57, no. 3, pp. 2135–2144, 2011.
  10. D. Perkins, “The fingertip effect: how information-processing technology shapes thinking,” Educational Researcher, vol. 14, no. 7, pp. 11–17, 1985.
  11. P. H. Winne, “How software technologies can improve research on learning and bolster school reform,” Educational Psychologist, vol. 41, no. 1, pp. 5–17, 2006.
  12. G. Lust, N. A. Juarez Collazo, J. Elen, and G. Clarebout, “Content Management Systems: enriched learning opportunities for all?” Computers in Human Behavior, vol. 28, no. 3, pp. 795–808, 2012.
  13. F. Marton and R. Säljö, “On qualitative differences in learning: outcomes and process,” British Journal of Educational Psychology, vol. 46, pp. 4–11, 1976.
  14. J. D. Vermunt, “Metacognitive, cognitive and affective aspects of learning styles and strategies: a phenomenographic analysis,” Higher Education, vol. 31, no. 1, pp. 25–50, 1996.
  15. J. D. Vermunt, “The regulation of constructive learning processes,” British Journal of Educational Psychology, vol. 68, no. 2, pp. 149–171, 1998.
  16. Y. J. Vermetten, H. G. Lodewijks, and J. D. Vermunt, “Consistency and variability of learning strategies in different university courses,” Higher Education, vol. 37, no. 1, pp. 1–21, 1999.
  17. V. V. Busato, F. J. Prins, J. J. Elshout, and C. Hamaker, “Learning styles: a cross-sectional and longitudinal study in higher education,” British Journal of Educational Psychology, vol. 68, no. 3, pp. 427–441, 1998.
  18. J. Ferla, M. Valcke, and G. Schuyten, “Student models of learning and their impact on study strategies,” Studies in Higher Education, vol. 34, no. 2, pp. 185–202, 2009.
  19. K. Lonka and S. Lindblom-Ylänne, “Epistemologies, conceptions of learning, and study practices in medicine and psychology,” Higher Education, vol. 31, no. 1, pp. 5–24, 1996.
  20. R. F. A. Wierstra, G. Kanselaar, J. L. Van Der Linden, H. G. L. C. Lodewijks, and J. D. Vermunt, “The impact of the university context on European students' learning approaches and learning environment preferences,” Higher Education, vol. 45, no. 4, pp. 503–523, 2003.
  21. J. Elen and G. Clarebout, “The use of instructional interventions: lean learning environments as a solution for a design problem,” in Handling Complexity in Learning Environments: Theory and Research, J. Elen and R. E. Clark, Eds., pp. 185–200, Elsevier, Oxford, UK, 2006.
  22. D. Jamieson-Noel and P. H. Winne, “Exploring students' calibration of self reports about study tactics and achievement,” Contemporary Educational Psychology, vol. 27, no. 4, pp. 551–572, 2002.
  23. J. C. Nunnally, Psychometric Theory, McGraw-Hill, New York, NY, USA, 1978.
  24. D. Steinley, “Local optima in K-means clustering: what you don’t know may hurt you,” Psychological Methods, vol. 8, no. 3, pp. 294–304, 2003.
  25. R. Tibshirani, G. Walther, and T. Hastie, “Estimating the number of clusters in a data set via the gap statistic,” Journal of the Royal Statistical Society, Series B, vol. 63, no. 2, pp. 411–423, 2001.
  26. G. Clarebout and J. Elen, “Tool use in computer-based learning environments: towards a research framework,” Computers in Human Behavior, vol. 22, no. 3, pp. 389–411, 2006.
  27. N. E. Perry and P. H. Winne, “Learning from learning kits: study traces of students' self-regulated engagements with computerized content,” Educational Psychology Review, vol. 18, no. 3, pp. 211–228, 2006.
  28. P. H. Winne, D. L. Jamieson-Noel, and K. Muis, “Methodological issues and advances in researching tactics, strategies and self-regulated learning,” in Advances in Motivation and Achievement: New Directions in Measures and Methods, P. R. Pintrich and M. L. Maehr, Eds., vol. 12, pp. 121–155, JAI Press, Greenwich, Conn, USA, 2002.
  29. A. F. Hadwin, P. H. Winne, D. B. Stockley, J. C. Nesbit, and C. Woszczyna, “Context moderates students' self-reports about how they study,” Journal of Educational Psychology, vol. 93, no. 3, pp. 477–487, 2001.
  30. W. Doyle, “Paradigms for research on teacher effectiveness,” Review of Research in Education, vol. 5, pp. 392–431, 1977.
  31. P. H. Winne and R. W. Marx, “Matching students' cognitive responses to teaching skills,” Journal of Educational Psychology, vol. 72, no. 2, pp. 257–264, 1980.
  32. P. H. Winne, “Minimizing the black box problem to enhance the validity of theories about instructional effects,” Instructional Science, vol. 11, no. 1, pp. 13–28, 1982.
  33. P. H. Winne, “Why process-product research cannot explain process-product findings and a proposed remedy: the cognitive mediational paradigm,” Teaching and Teacher Education, vol. 3, no. 4, pp. 333–356, 1987.
  34. J. Lowyck, J. Elen, and G. Clarebout, “Instructional conceptions: analysis from an instructional design perspective,” International Journal of Educational Research, vol. 41, no. 6, pp. 429–444, 2004.
  35. J. Elen and J. Lowyck, “Instructional metacognitive knowledge: a qualitative study on conceptions of freshmen about instruction,” Journal of Curriculum Studies, vol. 32, no. 3, pp. 421–444, 2000.
  36. J. Lowyck, E. Lehtinen, and J. Elen, “Students' perspectives on learning environments,” International Journal of Educational Research, vol. 41, no. 6, pp. 401–406, 2004.
  37. T. André, “Does answering higher-level questions while reading facilitate productive learning?” Review of Educational Research, vol. 49, pp. 280–319, 1979.
  38. G. Clarebout and J. Elen, “The complexity of tool use in computer-based learning environments,” Instructional Science, vol. 37, no. 5, pp. 475–486, 2009.
  39. P. Marek, R. A. Griggs, and A. N. Christopher, “Pedagogical aids in textbooks: do college students' perceptions justify their prevalence?” Teaching of Psychology, vol. 26, no. 1, pp. 11–18, 1999.
  40. F. Collopy, “Biases in retrospective self-reports of time use: an empirical study of computer users,” Management Science, vol. 42, no. 5, pp. 758–767, 1996.
  41. D. Jamieson-Noel and P. H. Winne, “Comparing self-reports to traces of studying behavior as representations of students' studying and achievement,” Zeitschrift für Pädagogische Psychologie, vol. 17, no. 3-4, pp. 159–171, 2003.
  42. D. Dunning, C. Heath, and J. M. Suls, “Flawed self-assessment: implications for health, education, and the workplace,” Psychological Science in the Public Interest, vol. 5, no. 3, pp. 69–106, 2004.
  43. J. Lompscher, “Learning strategy research: some results, problems, and prospects,” in Learning Strategies and Skill Learning: Essays in Honour of Nils Sovik, A. Flem and R. Karlsdotteri, Eds., Den Kongelige Norske Videnskabers Selskap, Skrifter 4, pp. 13–32, Tapir Forlag, Trondheim, Norway, 1998.
  44. D. Goforth, “Learner control = decision making + information: a model and meta-analysis,” Journal of Educational Computing Research, vol. 11, no. 1, pp. 1–26, 1994.
  45. M. D. Williams, “Steps towards cognitive achievements,” Elementary School Journal, vol. 85, no. 5, pp. 673–693, 1996.