New Perspectives on Integrating Self-Regulated Learning at School
Development and Evaluation of a Computer-Based Learning Environment for Teachers: Assessment of Learning Strategies in Learning Journals
Training teachers to assess important components of self-regulated learning such as learning strategies is an important, yet somewhat neglected, aspect of the integration of self-regulated learning at school. Learning journals can be used to assess learning strategies in line with cyclical process models of self-regulated learning, allowing for rich formative feedback. Against this background, we developed a computer-based learning environment (CBLE) that trains teachers to assess learning strategies with learning journals. The contents of the CBLE and its instructional design were derived from theory. The CBLE was further shaped by research in a design-based manner. Finally, in two evaluation studies, student teachers worked with the CBLE. We analyzed satisfaction, interest, usability, and assessment skills. Additionally, in evaluation study 2, effects of an experimental variation on motivation and assessment skills were tested. We found high satisfaction, interest, and good usability, as well as satisfying assessment skills, after working with the CBLE. Results show that teachers can be trained to assess learning strategies in learning journals. The developed CBLE offers new perspectives on how to support teachers in fostering learning strategies as a central component of effective self-regulated learning at school.
This paper describes the development of a computer-based learning environment (CBLE) that aims to support teachers in promoting learning strategies as a central component of self-regulated learning. More specifically, the CBLE aims to train teachers to assess learning strategies with an instrument suitable for the school context, namely, the learning journal. The CBLE was based on theoretical principles derived from the self-regulated learning literature on the one hand (with respect to the learning contents) and multimedia design on the other hand (with respect to the instructional design). In line with design-based research principles, we carried out formative research to test and refine the design of our CBLE [1, 2].
New perspectives on fostering self-regulated learning in school are provided in this paper in two respects. First, training teachers to assess important components of self-regulated learning such as learning strategies is an important, yet somewhat neglected, aspect of the integration of self-regulated learning at school. Indeed, teachers need skills to assess the processes of learning [3–6] in order to foster students’ strategies for learning, for example, by giving helpful specific feedback [7, 8]. Thus, we developed, revised, and tested a CBLE that aims to enhance teachers’ skills to assess learning strategies in school.
Second, we used learning journals as a method to foster and assess cognitive and metacognitive learning strategies. Recent models of self-regulated learning look at self-regulated learning as a process embedded in its ecological setting. Learning journals, embedded in naturally occurring events in the classroom, are an ecologically valid way of assessing students’ learning and self-regulated learning. Teachers can use learning journals to assess self-regulated learning in a way that allows for rich formative feedback. Thus, in the CBLE, we introduced the assessment of self-regulated learning using learning journals. In the following, we describe the conceptual framework of self-regulated learning underlying learning journals as an assessment of learning strategies and then deduce the training contents of the CBLE.
2. Conceptual Framework and Contents of the Computer-Based Learning Environment: Assessment of Learning Strategies in Learning Journals
In self-regulated learning, students control and influence their own learning processes towards their learning goals. Cyclical interactive process models [10, 11] presume that self-regulated learning proceeds in several phases. The learners define their learning goal in a forethought phase (phase 1), apply powerful cognitive strategies in a performance phase (phase 2), and monitor their understanding in a self-evaluation phase (phase 3) in order to identify and eliminate possible comprehension problems or adapt their cognitive strategies to better approach their goals. Planning adapted cognitive strategies can already be seen as the transition into the forethought phase of the next cycle. Although different models of self-regulated learning emphasize different components (e.g., motivation or overarching goals derived from students’ self [12, 13]), the components most directly related to knowledge acquisition and understanding are cognitive strategies and metacognitive strategies. Metacognitive strategies are used by learners to monitor and control their cognitive strategies. The present work focuses on cognitive and metacognitive strategies. Focusing on strategies that contribute directly to understanding and retention, we can use the broad categories of Weinstein and Mayer’s frequently cited taxonomy: rehearsal, organization, elaboration, and metacognitive learning strategies.
Recent efforts to measure self-regulated learning based on process models [5, 10, 16–18] use instruments that try to capture (self-regulated) learning processes close to the learning behavior or as they happen (online or event measures; see [5, 16]). One such instrument is the learning journal. It can well be used in school contexts [9, 19, 20]. The task of writing a learning journal can be assigned to all students of a classroom as follow-up course work in order to reflect on the learning contents and on their own understanding. The learning journal can be introduced to students in a way that lets them follow a cyclical interactive process of self-regulated learning, focusing on cognitive and metacognitive learning strategies (see Figure 1 for students’ examples; [7, 9, 19–22]). Students are encouraged to plan what learning contents they need to address in writing (forethought phase), to apply cognitive strategies in order to deepen understanding and retention (performance phase), and to monitor their understanding (self-evaluation phase) as well as remediate their learning if necessary (transition into the next cycle).
For example, students choose a topic they need to understand better (metacognitive planning). Then, students might simply write down (rehearse) some information about that topic (see Figure 1). On the basis of rehearsed information, students can identify main ideas, relations, or hierarchies within the new learning contents (organization strategies), for example, by highlighting or color-coding them, or by drawing a map that includes the main concepts in their learning journal. Furthermore, they can link new learning contents to prior knowledge or experiences (elaboration strategies); for example, they explain new concepts with concrete examples. Organization and elaboration particularly deepen understanding. Rehearsal, organization, and elaboration strategies are summarized as cognitive learning strategies. Metacognitive strategies become visible if students write down where they have difficulties in understanding (metacognitive monitoring) and how they plan to overcome those difficulties (transition into a further cycle).
Learning strategies are “materialized” in learning journals (i.e., written down) as they happen (“online” process). Teachers can thus assess those strategies and give specific feedback if they know how to differentiate categories (types) of learning strategies. Teachers should be able to identify types (categories) of learning strategies because different types serve different functions in understanding and retention ([14, 23]; e.g., elaboration integrates new information into prior knowledge structures, thereby creating associations; organization sorts out important from unimportant contents). Thus, in the conceptualization of the CBLE, a section with four interactive learning modules about rehearsal, organization, elaboration, and metacognitive learning strategies was planned. This section should teach in detail which learning strategies should be applied in writing learning journals according to the theoretical and empirical background, how they improve comprehension, and how to identify and evaluate them in learning journals. Teachers should learn to categorize learning strategies into the broader main categories (rehearsal, organization, etc.) and to identify subcategories as belonging to the broader categories. For example, underlining, highlighting, and summarizing main points of a topic belong to organization strategies. Planning (setting goals), monitoring understanding, and planning remedial strategies belong to metacognitive strategies [12, 14].
Cognitive and metacognitive strategies are closely related to learning outcomes, especially if measured close to the learning behavior [24, 25]. For an assessment of self-regulated learning related to learning outcomes, a teacher could simply sum up the number of instances of one type of the described strategies (quantity) in a learning journal entry. However, it is crucial that students not only use learning strategies but that they use them appropriately, so that the strategies actually fulfill their specific function [9, 24]. For example, a high quantity of note-taking can mean that students are not selective in recording information. Only if students select the most important (instead of marginal) information for their notes do they apply this organization strategy so that it strongly fulfills its function. Such a strategy application is of high quality. Schwamborn et al. found correlations from .57 to .82 between the quality of an elaboration strategy (learner-generated drawings) and learning outcomes. Glogger et al. found that both the quantity and quality of learning strategies correlated significantly with learning outcomes. Therefore, the CBLE was planned to train teachers in differentiating the quantity and quality of learning strategies so that they can assess and foster learning strategies with reference to (a) how often a student applied learning strategies (quantity) and (b) how well the learning strategies were applied (quality).
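The quantity/quality distinction can be made concrete with a small sketch. This is purely illustrative and not the authors’ software: the strategy categories come from the taxonomy above, while the data structure, function name, and ratings are invented for this example.

```python
# Illustrative sketch (not the authors' tool): tally how often a student
# applied each strategy category (quantity) and how well on average
# (quality, rated here hypothetically from 1 = low to 3 = high).
from collections import defaultdict

# Hypothetical ratings of one journal entry: (category, quality_rating)
ratings = [
    ("organization", 3),  # e.g., selected main ideas for a concept map
    ("organization", 1),  # e.g., copied marginal details verbatim
    ("elaboration", 2),   # e.g., linked a concept to a weak example
    ("rehearsal", 2),
]

def summarize(ratings):
    """Return quantity and mean quality per strategy category."""
    counts = defaultdict(int)
    quality_sums = defaultdict(int)
    for category, quality in ratings:
        counts[category] += 1
        quality_sums[category] += quality
    return {
        cat: {"quantity": counts[cat],
              "mean_quality": quality_sums[cat] / counts[cat]}
        for cat in counts
    }

print(summarize(ratings))
```

The sketch only shows why the two indicators can diverge: the hypothetical student here applies organization strategies twice (high quantity) but with mixed quality.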
The training to identify different learning strategies and to evaluate their quality in learning journals was seen as the core aspect of the CBLE. If teachers hold well-organized and networked schemata about general categories of learning strategies (e.g., elaboration, organization) and related subcategories (e.g., creating examples as an elaboration strategy), linked to related instances of those categories in learning journals, they should be able to apply them to assess students’ learning strategies in learning journals [26, 27].
However, we assumed that teachers additionally need (a) to be informed about the idea of writing learning journals and (b) to be supported in summarizing data, for example, by creating diagrams to get an overview of a class. In order to obtain qualitative information about the needed contents and functionalities, we conducted interviews with teachers.
3. Interviews about Teachers’ Needs
With a small sample of five teachers, we conducted semi-structured interviews. We were interested in the teachers’ prior knowledge and understanding of learning strategies and learning journals, and in which (planned) contents and functionalities the teachers perceived as helpful and necessary for their work. This information was supposed to help us select contents and functionalities of the CBLE in line with teachers’ needs. Five in-service teachers (3 females and 2 males) from several secondary schools were interviewed (3 teachers of lower-track classrooms within the German three-track system, grades 5 to 9; 2 teachers of middle-track classrooms, grades 5 to 10). We first explained to the teachers how we define learning strategies and learning journals, and we explained the basic goal of the CBLE. We then asked open questions about learning journals, learning strategies, and how teachers had assessed them in school so far. After that, we asked teachers what features and functionalities they would prefer in the CBLE in one open question and in more closed questions about specific features and functionalities. The specific features and functionalities had been conceptualized in the research team, which included one in-service teacher. The questions asked whether teachers would use each functionality on a 6-point scale (1 = no, never; 6 = yes, always) and what perceived (dis-)advantages of a specific functionality motivated their answer.
4. Results and Consequences for the Design of the Computer-Based Learning Environment
The semi-structured interviews first revealed one surprising result: teachers’ conceptions of learning strategies can differ profoundly from the scientific conception (e.g., a teacher mentioned a kind of problem-solving strategy instead of strategies which are applied in order to understand and learn: “I use brainteasers, which are also about strategies”). In addition, teachers lacked ideas of how they can assess learning strategies in general (e.g., “Good question, I think I have never really consciously assessed learning strategies”). The learning environment thus has to start at a quite basic point: define what learning strategies are and explain how learning journals can be implemented in classroom teaching as a medium to foster and assess learning strategies (self-regulated learning). Against this background, the first section of the preliminary CBLE was created. In Section 1, teachers learn how learning strategies are defined, what learning journals are, and how the application of learning strategies can be encouraged during journal writing. Instructional goals and potential positive effects of the application of learning journals are outlined. These contents are presented by a professional speaker with slides illustrating the verbal information.
According to the teachers themselves, besides a brief introduction to the application of learning journals and the categorization of learning strategies, a collection of materials for in-classroom use of the learning journal method should be given. The material should allow for effectively introducing students to journal writing and the application of learning strategies while writing. Thus, materials from previous studies were slightly adapted for teachers and made available in a second section of the CBLE. That is, in Section 2, teachers find materials for implementing learning journal writing in classrooms. The materials’ implementation is explained briefly.
In order to evaluate the written learning journals, teachers suggested a learning module introducing a category system of learning strategies. They stressed the importance of (real) students’ learning journal examples. These suggestions could well be aligned with the theoretically deduced conception of four learning modules about rehearsal, organization, elaboration, and metacognitive strategies (see the paragraph about the instructional design of the CBLE). These four modules constitute Section 3 of the CBLE.
When teachers apply the category system to assess their students’ learning journals, this assessment should not take longer than about 10–15 minutes per student. All teachers stated they would (always or often) use a database functionality to administer data by classroom and individual student (over time) and a functionality that allows for displaying a graphical overview of the results of a whole classroom. All five teachers emphasized that they would appreciate readily available feedback suggestions for fostering strategy application in students. These features were thus selected and provided by a fourth section of the CBLE. Section 4 of the CBLE, a cognitive tool, provides a learning journal management tool which is thought to help teachers evaluate hand-written learning journals in a time-efficient way, administer student data, and formulate feedback to students. A simplified rating catalogue was developed so that the assessment of one student’s learning journal takes only about 10–15 minutes. Section 3 was supposed to train teachers in categorizing learning strategies in learning journal examples and applying quality criteria to the examples. With this background knowledge, the rating catalogue of Section 4 can be used to assess the learning strategies not only in a time-efficient way, but also in a way that is well grounded in theory. Section 4 also helps to analyze and compare students’ use of learning strategies within or across courses by displaying graphical overviews of the results of whole classrooms. Finally, it helps teachers to formulate supportive formative feedback for students. Text blocks are given as feedback suggestions (e.g., “You have linked new learning contents with something familiar by an example from our lesson. It would be great for your own learning if you tried to find an example of your own next time. If you think about the learning content intensively by finding your own examples, you will be able to memorize it better”).
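Conceptually, such ready-made feedback suggestions amount to text blocks keyed by strategy category and rated quality level. The following is a rough, hypothetical sketch of that lookup (the mapping, the function name, and the second text block are invented; the first text block paraphrases the example quoted above):

```python
# Hypothetical sketch of the feedback-suggestion lookup in an
# assessment tool: canned text blocks keyed by (category, quality).
FEEDBACK_BLOCKS = {
    ("elaboration", "middle"): (
        "You have linked new learning contents with something familiar "
        "by an example from our lesson. Next time, try to find an "
        "example of your own; that will help you memorize the content."
    ),
    ("organization", "low"): (
        "Try to select only the most important ideas when you take "
        "notes, instead of copying everything."
    ),
}

def suggest_feedback(category, quality):
    """Return a ready-made text block, or None if none is defined."""
    return FEEDBACK_BLOCKS.get((category, quality))

print(suggest_feedback("organization", "low"))
```

In the actual tool, a teacher would presumably adapt such a block before sending it, rather than using it verbatim.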
A functionality that the teachers would not or only rarely use would be to exchange ideas or pose questions online about strategy assessment or the application of the program. Therefore, this (initially planned) feature was not realized. In addition, teachers preferred an offline version over a version that could be accessed online, because not all teachers would have a constant internet connection, many colleagues would be concerned about the security of the students’ data, and teachers do not have their own personal computer workstation at school. Therefore, the CBLE was developed as an offline version based on Adobe Flash CS2. Flash made the CBLE platform-independent; that is, it potentially runs on different systems (Mac, Linux, and Windows) without needing an installation. If teachers or student teachers need to use the CBLE on computers in school or university, they do not need administrator rights, because no installation is needed.
Altogether, the interviews gave us the following important hints: introduce contents at a quite basic level, give ready-to-use material for introducing students, give a lot of examples, and provide a way to assess learning strategies and give feedback in a time-efficient way.
5. Summary of Contents of the Computer-Based Learning Environment: Assessment of Learning Strategies in Learning Journals
In summary, in line with the interview results, the first preliminary CBLE had four main sections (Figure 2): (1) an introductory video about learning journals as an assessment method for learning strategies, (2) materials for introducing students to writing learning journals (e.g., PowerPoint presentations and example lesson plans), (3) four interactive learning modules about rehearsal, organization, elaboration, and metacognitive learning strategies, and (4) a cognitive tool, that is, a database-driven learning journal management tool with a simplified rating scheme for the fast assessment of learning journals (“assessment tool” in Figure 2). In the following, we report how we considered important multimedia principles and approved instructional support procedures in the design of the CBLE.
6. Applying Multimedia Principles and Approved Instructional Support Procedures
We considered important design principles for multimedia learning when developing the CBLE [28–30]. These principles were especially important for the development of the core part of the CBLE, the interactive learning modules about rehearsal, organization, elaboration, and metacognitive learning strategies (Section 3; see Figure 3). This part aimed to teach (prospective) teachers well-organized, networked schemas, linked to concrete examples, which should enable them to apply this knowledge to students’ learning journals [26, 27].
6.1. Multimedia Principles
The underlying software Flash allowed us to realize Mayer’s multimedia principle by including pictures, sound files, and videos and by creating animations. According to the modality principle, we combined visualizing elements and spoken text. An example of the modality principle is shown in Figure 4. Another example is a student’s visualization that is presented with a spoken explanation of how this visualization is an elaboration strategy.
In addition, we aimed to present pictures and texts that belong together in spatial or temporal contiguity (contiguity principle). We avoided presenting irrelevant information such as decorative illustrations or marginal information in texts (coherence principle). We also avoided presenting redundant information simultaneously (redundancy principle). We realized the personalization principle by addressing learners directly in spoken and written texts.
Several functionalities address the control-of-processing principle. Whenever there is spoken text, learners can pause and/or start the sound by clicking on the play button (Figure 5, (1)). They can also navigate freely within the audio file, that is, jump to a specific time of the audio file by clicking on the timeline (2). These options allow some learner control as formulated by Schnotz. Learners can control the speed of processing the given information. To avoid the “fleetingness” of acoustic explanations, learners can view the textual version of the spoken text (by clicking on “show audio as text” (4) in Figure 5).
6.2. Approved Instructional Support Procedures
6.2.1. Transition Principle of Instructional Support
Following this principle formulated by Hilbert et al., each strategy module starts with an expert’s explanation (direct instruction) about the respective strategy (rehearsal, elaboration, organization, or metacognition). Then, students’ journal examples are presented, and the expert’s explanation can be started by clicking. The expert explains how the journal passage exemplifies the explained learning strategy category. Some of these passages consist of contrasting examples from students’ learning journals (see Section 6.2.2). Using many real students’ examples (adapted from previous studies) was in line with the interviewed teachers’ suggestions. Each module ends with tasks that ask the learners to self-explain why journal example passages represent one kind of learning strategy.
6.2.2. The Contrasting Examples Principle
Comparing two or more instances of a subject is a successful method of making underlying differentiation principles salient [32–34]. This principle was used when differentiating different kinds of strategies within one strategy module (e.g., different elaboration strategies are contrasted in examples). We also applied the contrasting examples principle in the quality part of each module. Each of the four strategy modules ends with a part about the quality aspect of the learning strategy. In order to sensitize learners to the underlying criteria for evaluating the quality of the strategy, we contrasted examples that differ in the degree to which the strategy’s specific function is fulfilled (i.e., their quality).
6.3. Interactive Learning Tasks
The computer-based learning environment was developed for teachers to work on their own, without further support by a trainer or by peer learners. Therefore, all the learning activities can be accomplished without further instructional support or additional material. Basically, the following two kinds of learning tasks were implemented.
6.3.1. Practice with Self-Explanation Tasks
A learning journal passage with one realized learning strategy was presented together with the correct categorization. The learners were prompted to explain why this learning strategy can be categorized in this way (e.g., “This is an elaboration strategy. Why?”).
6.3.2. Practice with Classification Tasks
In order to train learners to differentiate levels of quality (high, middle, or low), at the end of the quality submodules teachers had to classify journal passages of different quality by dragging and dropping them onto three quality levels (see the boxes in Figure 6: (1) high, (2) middle, or (3) low quality; see also the contrasting examples principle).
6.3.3. Elaborated Feedback
After completing (all) the classifications of one task or after choosing one of the strategy categories, feedback on the correctness is provided, and learners can read an explanation of why an answer is correct or incorrect.
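Taken together, the classification tasks and the elaborated feedback could work roughly as sketched below. All content here is invented for illustration; the passage labels, answer key, and explanations are not from the actual CBLE:

```python
# Rough sketch of a quality-classification task with elaborated
# feedback (all passage labels and explanations invented).
ANSWER_KEY = {
    "passage_a": "high",    # main ideas selected and related
    "passage_b": "low",     # marginal details copied verbatim
    "passage_c": "middle",  # main ideas selected but not related
}

EXPLANATIONS = {
    "passage_a": "The student selected main ideas and related them.",
    "passage_b": "Only marginal details are copied, so quality is low.",
    "passage_c": "Main ideas are selected but not related to each other.",
}

def check_classification(answers):
    """Return per-passage correctness plus an elaborated explanation."""
    feedback = {}
    for passage, chosen_level in answers.items():
        feedback[passage] = {
            "correct": chosen_level == ANSWER_KEY[passage],
            "explanation": EXPLANATIONS[passage],
        }
    return feedback

result = check_classification({"passage_a": "high", "passage_b": "middle"})
```

The point of the elaborated feedback is the explanation string: whether the answer is right or wrong, the learner reads why the passage belongs to its quality level.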
7. Formative Evaluation of the Computer-Based Learning Environment
Formative evaluation of preliminary versions of the CBLE was conducted by (1) asking experts for feedback, (2) walkthroughs by the research team whenever a new version was finished, and (3), in line with design-based research principles, using the CBLE in a small student teacher class and asking for feedback. Finally, two evaluation studies were conducted.
7.1. Expert Sessions
Altogether, six experts worked with a prototype including the module about organization strategies. We asked for written feedback about usability aspects, comprehensibility of the contents, and multimedia principles before we collected open oral feedback. Important refinements in reaction to the expert sessions were (a) adding extra pages with overviews or with short introductions in order to give the learner more orientation within the learning contents; (b) refinements referring to multimedia principles; for example, when graphics (journal passages or “illustrations” of explanations) included too much text, experts recommended avoiding simultaneous audio. Thus, in the case that journal passages contain primarily textual information, we prompted learners to first read the journal example carefully and then activate the spoken instructional explanations by clicking a button.
7.2. Research Team
The research team consisted of the authors (including one in-service teacher who also worked in teacher education) and five student research assistants who studied educational psychology, instructional design, or teaching. Within the research team, we continuously conducted cognitive walkthroughs, documented feedback, and systematically incorporated it into the software. Important refinements comprised highlights in graphics in order to guide attention, as well as highlighting navigation elements in order to show the learner where s/he is and what else is coming. Refinements that were made after almost each feedback cycle concerned coherence and comprehensibility of texts (e.g., optimizing wording, using the same words for a concept throughout a module, and omitting irrelevant information).
7.3. Student Teacher Session
Next, we evaluated the prototypical CBLE with a small teacher education class of nine student teachers. On average, they answered questions about motivation and satisfaction positively. Again, we asked for written feedback about usability aspects, comprehensibility of the contents, and selected multimedia principles before we collected open oral feedback. Important refinements in reaction to the student teacher session were (a) adjustment of tasks (e.g., some self-explanation tasks had to be made more difficult; feedback in classification tasks should not be given until the learner asks for it by clicking a button); (b) more text was changed to audio; (c) texts were structured by more underlining or highlighting of important aspects; (d) orientation in the learning environment, partly a matter of usability, still had to be enhanced. Therefore, we added the figure of a pilot to support navigation and orientation in the interactive learning environment. When a learner enters a new part of the learning environment (or a new kind of task), the pilot describes which activities are expected from or offered to the learner. If applicable, the pilot also explains how specific functionalities of the program can be used. On the left, there is a pilot button through which this information about the specific part of the program can be recalled (Figures 3, 4, and 6, at the left margin, “Lotse”).
The latter refinement prompted us to also insert an expert button. The expert button (above the pilot button) allows learners to reread instructional explanations about every learning strategy (rehearsal, elaboration, organization, and metacognition), no matter which module the learner is currently in. Finally, the formative evaluations and refinements resulted in the version of the CBLE that was used in the evaluation studies described in the next sections.
8. Evaluation Study I of the Computer-Based Learning Environment
The aim of this evaluation study was to test whether (prospective) teachers learn to assess learning strategies in learning journals efficiently and effectively by working with the CBLE. Additionally, we regarded high acceptance of the CBLE as a favorable precondition for its use in later “real” practice. The CBLE should be attractive so that (preservice) teachers actually invest time and effort in working with the environment [36–38]. Therefore, the study tested whether (prospective) teachers are interested in the learning content, are motivated while working with the environment, evaluate the contents as comprehensible, find the environment easy to use (usability), and, in the end, learn to assess learning strategies in authentic learning journal passages.
Forty-four German student teachers (25 females and 19 males) from a German university participated in this laboratory study. They worked through one learning module of the CBLE that introduced several subcategories of organization strategies (20 minutes). They were then asked to rate their satisfaction with the CBLE (e.g., “I would recommend this learning environment”), their interest in the contents (e.g., “I find it important to learn about a scheme for assessing learning strategies”), the usability (e.g., “Operating the learning environment was easy”), and their motivation to keep on working in the CBLE (“I would like to work through the other parts of the learning environment (assessment of rehearsal, elaboration, and metacognitive learning strategies)”). All items were rated on a 7-point scale (1: not at all true; 7: absolutely true). A posttest with 14 items including authentic learning journal passages assessed participants’ assessment skills. They had to identify learning strategies, evaluate the quality of applied learning strategies, and provide reasons for their judgments. One third of the posttests was rated by two independent raters using a 6-point scale for each item (6: all central aspects, coherently related to each other). As interrater reliability was very good, the remaining tests were rated by only one rater.
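As an aside, interrater agreement of this kind can be quantified, for instance, as a correlation between the two raters’ item scores. The paper does not state which coefficient was used, so the following sketch shows only one common option (Pearson correlation) with invented ratings:

```python
# Illustrative only: a Pearson correlation between two raters' scores
# on the same posttests (the actual coefficient and data are not
# reported here; the ratings below are invented).
import math

def pearson(x, y):
    """Pearson correlation of two equal-length numeric sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical 6-point ratings of the same posttests by two raters
rater_1 = [6, 4, 5, 2, 3, 6, 1, 4]
rater_2 = [6, 4, 4, 2, 3, 5, 1, 4]
print(round(pearson(rater_1, rater_2), 2))
```

A value close to 1 would justify, as in the study, having the remaining tests rated by a single rater.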
We found high satisfaction and high topic-specific interest (scale from 1 to 7). For example, the student teachers rated the item “I find it important to learn about a scheme for assessing learning strategies” on the 7-point scale (7: absolutely true) at 6.11 on average. That is, student teachers perceived the learning contents of the CBLE as very important.
Results regarding the usability of the CBLE showed that student teachers did not report difficulties using the CBLE. For example, student teachers perceived the environment as not unnecessarily complex and found its structure understandable.
Additionally, motivation to keep on working in the CBLE was rated 6.55 on average. Assessment skills after the short learning phase were satisfying: on average, student teachers achieved half of the maximum score on most items (overall score on the scale from 1 to 6). That is, after 20 minutes of learning, they were able to identify and evaluate learning strategies in authentic learning journal passages that they had not seen before.
Important refinements in the CBLE in reaction to Study 1 were (a) adding or changing keywords or symbols that communicated to the learners how or where they can close a window (e.g., when working in a specific submodule, the whole window has to be closed in order to get back to the overview page; in order to make that clear, we replaced “close window” with “back to overview”); (b) adding saving functions (e.g., answers to a task were saved for the user, as well as which modules had been finished). Results suggested that the theoretically deduced contents and the instructional design of the module could serve as a model for the other modules. Thus, we completed a second module (elaboration), which was studied in the second evaluation study.
9. Evaluation Study II in Student Teacher Courses
In this evaluation study, we had a twofold research question. Student teachers worked with and evaluated our CBLE during their School of Education courses. We were again interested in their satisfaction with the CBLE, their interest in the presented contents, their motivation to keep on working in the CBLE, usability, and learning outcomes. In addition, we tested a simple variation that is realistic for in-service teachers and potentially helpful for teacher education. We expected that mere exposure to an authentic problem prior to working with the CBLE (without attempts to solve it) would enhance motivation and learning outcomes. The authentic problem consisted of passages from students’ learning journals. Student teachers were asked to give those students feedback after learning, with the CBLE, how to evaluate learning strategies in learning journals.
Approaches that try to optimize receptive forms of learning and direct forms of instruction by prior problem-oriented activities were the theoretical background for this variation (e.g., [39, 40]). Such approaches typically use generative group activities that are quite time consuming. These activities are thought to generate forms of prior knowledge that allow for subsequent elaborative processes. Simply knowing about target problems might, however, already sufficiently prepare for learning from direct instruction, as the goal of learning, knowledge deficits, and the application context become salient.
Eighty-nine student teachers (71 females and 18 males) from two equivalent seminars at a German School of Education participated in the study (title of the seminars: “Selected topics of mathematics instruction”). Two weeks prior to the intervention, they all worked on a pretest during their seminar. The pretest measured assessment skills regarding learning strategies by two open questions about learning strategies and how student teachers would identify and evaluate them in school.
During the intervention, participants were randomly assigned to one of two conditions: (a) they received the authentic problem prior to learning with the CBLE and had 5 minutes to familiarize themselves with the problem (problem-first group, ); or (b) they received the same problem after learning (problem-after group, ). The problem consisted of learning journal passages and a description of a situation (“Here you see two learning journal passages of two students. Your challenge is the identification of learning strategies and the formulation of feedback to the students. Use the information from the learning environment.”). All participants then worked for 30 minutes in the CBLE module that introduces the assessment of elaboration strategies. Providing feedback was not specifically taught.
After learning, all participants answered the items on satisfaction with the CBLE, their interest in the contents, the usability, and their motivation to keep on working in the CBLE. The items were the same as in the first evaluation study (see Study 1 for examples). Then, the problem-first group had 10 minutes to solve the problem, that is, to identify learning strategies in the learning journals and to write feedback to the students. The problem-after group now received the problem, read it carefully (5 minutes), and worked on it for 10 minutes; that is, time-on-task was equal across groups. The posttest contained three items that measured assessment skills. Student teachers had to evaluate the quality of elaboration strategies in learning journal passages as low, middle, or high and to explain their assessment by completing the sentence “I evaluate the quality of the elaboration strategy in passage 1 as … [low, middle, high], because…” The items were rated on a 6-point scale. Transfer was measured by coding student teachers’ feedback on two learning journal passages. Scores were (1) the number of sections coded as “learning strategy use”, where student teachers pointed out which learning strategies the student uses in the learning journal and in which quality (“You found an own example which you explained very nicely and in detail”); (2) the number of sections coded as “suggestions for improving learning strategies” (“Next time, you could improve your example by additionally drawing an illustration of the example problem”). Two independent raters rated and coded the test items (ICCs > 0.80).
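The paper reports only that the raters’ ICCs exceeded .80; the exact ICC form used is not stated. As an illustration only (not the authors’ actual analysis), the following sketch computes a two-way random-effects, absolute-agreement ICC(2,1) for a targets-by-raters matrix; the function name and the sample data are hypothetical.

```python
import numpy as np

def icc2_1(ratings):
    """Two-way random-effects, absolute-agreement ICC(2,1).

    ratings: array-like of shape (n_targets, k_raters), one score per
    target (e.g., test item) and rater.
    """
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)   # per-target means
    col_means = x.mean(axis=0)   # per-rater means
    # Two-way ANOVA decomposition: targets (rows), raters (columns), error
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_err = ((x - grand) ** 2).sum() - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )

# Hypothetical example: two raters scoring three posttest items on a
# 6-point scale; perfect agreement yields an ICC of 1.0.
print(icc2_1([[4, 4], [2, 2], [6, 6]]))  # → 1.0
```

Because ICC(2,1) measures absolute agreement, a constant offset between raters (one rater systematically stricter) lowers the coefficient even when the rank ordering of targets is identical.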
9.2. Results and Discussion
With regard to satisfaction with the CBLE, we again found high values (; scale from 1 to 7). The motivation to work with the CBLE and the interest in the learning contents were very high (). For example, the student teachers rated the item “I find it important to have learned about a scheme for assessing learning strategies” on a scale from 1 to 7 (as in Study 1, i.e., 7: absolutely true) at 6.41 () on average. The item “I would like to work through the other parts of the learning environment (assessment of rehearsal, organization, and metacognitive learning strategies)” was rated at 6.35 () on average.
Results regarding the usability of the CBLE showed that student teachers found it easy to use (). Most items were rated at the second highest point on the scale from 1 to 7 (7: absolutely true). Thus, student teachers perceived the learning environment as, for example, easy to operate () and found its structure understandable ().
Assessment skills after the short learning phase were again satisfactory. On average, student teachers achieved at least half of the maximum score on all items (overall score: ). That is, they were able to efficiently learn and to transfer their acquired knowledge on learning strategies to the evaluation of new authentic journal passages.
With regard to the experimental variation, we found that student teachers who first familiarized themselves with the authentic problem achieved higher scores on the assessment skill measure (), as well as on the feedback transfer score “learning strategy use” (). Suggestions, however, were made more often by the problem-after group (). This is surprising and can be seen as problematic, because these student teachers told students how to improve their learning strategy use even though they were less successful than the problem-first group in identifying the students’ learning strategies and explaining why the strategies were suboptimal. However, well-grounded, specific, and thereby helpful suggestions can scarcely be made without a detailed identification of learning strategies and their quality. Prior knowledge was controlled for in the preceding analyses and was not significantly related to the learning outcome scores. Motivation did not differ significantly between groups before or after the learning phase, which can be explained by ceiling effects; almost all student teachers were highly motivated (all on the scale from 1 to 7; pre: ; post: ).
In summary, receiving an authentic problem prior to learning prepared student teachers to learn and to transfer their knowledge to giving feedback based on specific observations. In contrast to previous research on preparing learners for direct forms of instruction, time-consuming instructional methods were not necessary. Instead, mere exposure to a problem taken from their professional context can prepare student teachers to learn from direct instruction. Satisfaction, interest, and usability ratings of the CBLE were high, which is an important precondition for its use in practice. These results served as further evidence that the remaining modules of the CBLE could be completed on the basis of the two studied modules.
This paper has introduced a CBLE that can ultimately support teachers in promoting self-regulated learning in school by training their skills in assessing learning strategies. We reported on the theoretical deduction of contents as well as instructional design features, and on further shaping the conception of contents and design through teacher interviews and formative evaluation cycles. The reported evaluation studies suggest that teachers can learn to assess learning strategies in learning journals efficiently and effectively by working with the CBLE. In fact, learning and transfer scores were mostly in the middle of the scales. This achievement could seem moderate; however, two points have to be considered. First, student teachers worked with only one of the four strategy modules for 20–30 minutes. Second, the posttest items all required the application of learned principles to learning journal passages; that is, they required some transfer. Therefore, the participants’ achievements in identifying learning strategies (quantity) and in evaluating the strategies’ quality after such a short learning phase can indeed be regarded as substantial.
Exposing student teachers to an authentic problem prior to learning, that is, having them expect to give feedback on learning strategy use to students after working with the CBLE, enhanced the feedback they gave. Motivational variables as well as usability were rated very high in both evaluation studies.
Evaluation Study 2 in particular suggests that the CBLE is of practical use, as it was conducted in regular teacher education courses. Results suggest that student teachers would benefit from and enjoy working through the CBLE during regular teacher education courses, similar to the setting of the study. Working with the CBLE as homework might also be effective if, for example, authentic problems are discussed in the courses. The high acceptance of our CBLE is a favorable precondition for its use in later practice.
To date, our approach to developing the computer-based learning environment has taken first successful steps. Nevertheless, open questions remain to be addressed in the future. The benefits of our CBLE should be analyzed in more depth by studies that assess teachers’ learning processes more closely. Interventions (e.g., a pretraining as in ) prior to learning with the CBLE could be of additional value, because we observed mistakes (apparently based on incoherently organized knowledge pieces) in the pretest that were repeated in the posttest. Furthermore, studies on how students’ self-regulated learning is affected by teachers’ use of the CBLE would be worthwhile. We will keep on offering courses using the CBLE to (student) teachers. Thus, many (student) teachers will have the opportunity to learn more about assessing learning strategies in learning journals, which can enable them to better promote self-regulated learning in school.
The project “Das Lerntagebuch als Mittel zur formativen Diagnostik von schulischen Lernstrategien” (the learning journal as a means for the formative assessment of learning strategies in school) was part of the research program “Bildungsforschung” (educational research) of the Baden-Württemberg Stiftung. The authors would like to thank all teachers and student teachers who participated for their cooperation, and Cornelia Go, Elmar Offenwanger, and the research assistants Yana Ampatziadis, Patricia Schirtz, Maren Oppelland, and Julian Roelle for their dedicated collaboration. Special thanks go to Franziska Trischler for professionally narrating and recording the texts.
A. Collins, D. Joseph, and K. Bielaczyc, “Design research: theoretical and methodological issues,” Journal of the Learning Sciences, vol. 13, no. 1, pp. 15–42, 2004.
R. T. Clift, E. S. Ghatala, M. M. Naus, and J. Poole, “Exploring teachers' knowledge of strategic study activity,” Journal of Experimental Education, vol. 58, pp. 253–263, 1990.
P. L. Peterson, “Teachers' and students' cognitional knowledge for classroom teaching and learning,” Educational Researcher, vol. 17, no. 5, pp. 5–14, 1988.
P. H. Winne and N. E. Perry, “Measuring self-regulated learning,” in Handbook of Self-Regulation, M. Boekaerts, Ed., pp. 531–566, Academic Press, Orlando, Fla, USA, 2000.
I. Glogger, R. Schwonke, L. Holzäpfel, M. Nückles, and A. Renkl, “Learning strategies assessed by journal writing: prediction of learning outcomes by quantity, quality, and combinations of learning strategies,” Journal of Educational Psychology, vol. 104, pp. 452–468, 2012.
P. H. Winne and A. F. Hadwin, “Studying as self-regulated learning,” in Metacognition in Educational Theory and Practice, D. J. Hacker, J. Dunlosky, and A. C. Graesser, Eds., pp. 277–304, Lawrence Erlbaum Associates, Mahwah, NJ, USA, 1998.
B. J. Zimmerman, “Becoming a self-regulated learner: an overview,” Theory into Practice, vol. 41, no. 2, pp. 64–70, 2002.
C. E. Weinstein and R. E. Mayer, “The teaching of learning strategies,” in Handbook of Research in Teaching, C. M. Wittrock, Ed., pp. 315–327, Macmillan, New York, NY, USA, 1986.
B. J. Zimmerman, “Attainment of self-regulation: a social cognitive perspective,” in Handbook of Self-Regulation, M. Boekaerts, P. Pintrich, and M. Zeidner, Eds., pp. 13–39, Academic Press, Orlando, Fla, USA, 2000.
I. Glogger, L. Holzäpfel, R. Schwonke, M. Nückles, and A. Renkl, “Activation of learning strategies in writing learning journals: the specificity of prompts matters,” German Journal of Educational Psychology, vol. 23, pp. 95–104, 2009.
R. E. Mayer, “Learning strategies for making sense out of expository text: the SOI model for guiding three cognitive processes in knowledge construction,” Educational Psychology Review, vol. 8, no. 4, pp. 357–371, 1996.
H. Borko and R. T. Putnam, “Learning to teach,” in Handbook of Educational Psychology, D. C. Berliner and R. C. Calfee, Eds., pp. 673–708, Macmillan Library Reference USA, Simon & Schuster Macmillan, New York, NY, USA, 1996.
A. Renkl, “Instruction based on examples,” in Handbook of Research on Learning and Instruction, R. E. Mayer and P. A. Alexander, Eds., pp. 272–295, Routledge, New York, NY, USA, 2011.
R. E. Mayer, Multimedia Learning, Cambridge University Press, New York, NY, USA, 2001.
R. E. Mayer and R. Moreno, “Nine ways to reduce cognitive load in multimedia learning,” Educational Psychologist, vol. 38, no. 1, pp. 43–52, 2003.
W. Schnotz, “An integrated model of text and picture comprehension,” in Cambridge Handbook of Multimedia Learning, R. E. Mayer, Ed., pp. 49–69, Cambridge University Press, Cambridge, UK, 2005.
T. Hilbert, S. Schworm, and A. Renkl, “Learning from worked-out examples: the transition from instructional explanations to self-explanation prompts,” in Instructional Design for Effective and Enjoyable Computer-Supported Learning, P. Gerjets, P. A. Kirschner, J. Elen, and R. Joiner, Eds., pp. 184–192, Knowledge Media Research Center, Tuebingen, Germany, 2004.
J. D. Bransford and D. L. Schwartz, “Rethinking transfer: a simple proposal with multiple implications,” Review of Research in Education, vol. 24, pp. 61–100, 1999.
P. Gerjets, K. Scheiter, and J. Schuh, “Information comparisons in example-based hypermedia environments: supporting learners with processing prompts and an interactive comparison tool,” Educational Technology Research and Development, vol. 56, no. 1, pp. 73–92, 2008.
T. W. Malone and M. R. Lepper, “Making learning fun: a taxonomy of intrinsic motivations for learning,” in Aptitude, Learning and Instruction III: Conative and Affective Process Analyses, R. E. Snow and M. J. Farr, Eds., pp. 223–253, Lawrence Erlbaum Associates, Hillsdale, NJ, USA, 1987.
H. G. Schmidt, M. L. de Volder, W. S. de Grave, J. H. C. Moust, and V. L. Patel, “Explanatory models in the processing of science text: the role of prior knowledge activation through small-group discussion,” Journal of Educational Psychology, vol. 81, no. 4, pp. 610–619, 1989.
D. L. Schwartz and J. D. Bransford, “A time for telling,” Cognition and Instruction, vol. 16, no. 4, pp. 475–522, 1998.
A. Ohst, I. Glogger, M. Nückles, and A. Renkl, “Knowledge in Pieces einen Rahmen geben? Ein Ansatz zur Optimierung eines Trainings [Giving knowledge in pieces a frame? An approach to optimizing a training],” Bielefeld, Germany, September 2012.