Abstract

If students have a broad repertoire of study skills, learning will likely be positively affected, since they can adapt the way they learn to different situations. Such study skills can be learned in, for example, learning-to-learn courses. Several studies of such courses have been done over the years, but few have been carried out in longitudinal, naturalistic settings, where the effect is evaluated over several years under nonexperimental conditions. In this paper, we present a novel approach for learning study skills as part of a course running over three years. The course starts with a learning-to-learn module, followed by 11 follow-ups that include, among other things, peer discussions about learning strategies with the aim of promoting self-regulated learning. This evaluation shows which study skills the students were most interested in trying, how successful they were in continuing to use them, and which effects the students believed the study skills had after trying them. No significant change was found in how satisfied the students were with their overall study technique immediately after the initial module, but in the long term, 78% of the students believed the course had promoted their ability to analyze and adapt their study habits. We conclude that our approach could be a useful way to get students to improve their repertoire and use of study skills, and we believe that the students will also improve their general self-regulated learning skills.

1. Introduction

Getting a degree from a university requires a great deal of time and effort from students. A typical 5-year education nominally requires 8000 hours of studying. A common explanation for failing in educational settings is based on the “just-world hypothesis” [1], a cognitive bias according to which “people get what they deserve”: the reason for failing a course is that not enough effort was put into studying, and therefore studying more will solve the problem. However, time-on-task is not a sufficient condition for learning, merely a necessary one, and if lack of time was not the real cause, then providing more time will not help [2]. Indeed, time-on-task can in some cases be directly harmful, since it can lead to surface learning strategies [3]. Instead of time, productive time is a better measure, described as the fraction of time that a student spends on appropriate learning activities [4], and focusing on increasing productive time is often better than increasing time-on-task in general (ibid).

A way to increase productive time is to help students learn “study skills.” Hattie et al. [5] showed in a meta-analysis that study skill intervention programs work in most cases. In particular, having many study skills and being able to choose the ones suitable for a specific situation has positive outcomes. Their results also support that training should “promote a high degree of learner activities and metacognitive awareness.” However, individual study skills cannot in general be said to be effective in themselves, since what is an effective strategy in one domain can be ineffective in another [6].

An important aspect of many study skill programs or “learning-to-learn” courses is to increase the students’ abilities for self-regulated learning [5, 7]. Self-regulated learning refers to the degree to which individuals can regulate aspects of their thinking, motivation, and behavior during the learning process [8]. It is learning that is guided by metacognition (thinking about one’s thinking), strategic action (planning, monitoring, and evaluating personal progress against a standard), and motivation to learn. There are several studies showing the importance of self-regulated learning for academic achievement (e.g., [9, 10]). Zimmerman [11] states that self-regulated learners use systematic and controllable strategies and acknowledge their responsibility for achieving the learning outcomes.

At KTH, we try to train the students to become self-regulated learners through a program integrating course that includes a study skills module. In this article, we describe this module and examine whether and how the students analyze and adapt their study habits as a result of the course.

This study differs from most other studies of study skills, which are rarely carried out in naturalistic settings but instead impose a specific study skill on a number of students [12]. In this study, the course was compulsory in the educational program and ran with the same students over a period of three years, without any requirement to use a specific study skill, which means that it was carried out in a naturalistic setting and that the results can be evaluated longitudinally.

2. Materials and Methods

2.1. Design of the Course, the Study Skills Module, and the Intervention

The Program Integrating Course (PIC) is a metacourse in a program, running over several years (3 in our case), meant to strengthen the program coherence. It aims to show the main themes and progression of the program and enable the students to become more professional in handling their studies. The course consists of reflection seminars, 4 times a year, in small (12–14 students) cross-grade groups with a professor as a mentor. This concept was invented by the first author in 2008 [13] and has been further developed by both authors since then [14–18].

PIC has so far spread to 24 programs at KTH, Linköping University, and Uppsala University. The courses studied here are given by the authors to Media Technology students (70 students starting each year) and Computer Science and Engineering students (170 students starting each year) at KTH.

A study motivation and study skills module is given in the first semester of the first year of the Computer Science and Engineering program and the Media Technology program. The module consists of the following parts (see also the timeline in Figure 1):
(1) The students are instructed to watch at least four of nine short videos in which Björn Liljeqvist, a young specialist in study skills, explains and motivates the use of a number of study skills (https://www.kth.se/social/group/studieteknik/page/bjorns-studieteknik). They are also instructed to read a short book on how to study [19].
(2) The students write a reflective text about their own study habits and choose at least one new study skill to try during the coming months.
(3) The students read each other’s texts within the group.
(4) The students in the group meet and discuss the topic and their reflections in a one-hour seminar.
(5) About six weeks later, the students write a new text, reflecting on how the attempt to try a new study skill turned out, and discuss this at a new seminar.

After these first two seminars, the Program Integrating Course continues to bring up questions about how the students are studying in their current courses in every seminar, that is, in twelve seminars during three years. There are also other topics of the course that are related to study habits, for example, the PIC module on procrastination [15, 20]. We therefore hypothesize that PIC will improve the study skills and study habits of the students throughout all three years.

Hattie et al. [5] broadly classify study skill interventions into three categories: cognitive interventions, metacognitive interventions, and affective interventions. Cognitive interventions are described as “those that focus on developing or enhancing particular task-related skills, such as underlining, note taking, and summarizing.” Metacognitive interventions are described as “those that focus on the self-management of learning, that is, on planning, implementing, and monitoring one’s learning efforts, and on the conditional knowledge of when, where, why, and how to use particular tactics and strategies in their appropriate contexts.” Finally, affective interventions are described as “those that focus on such non-cognitive aspects of learning as motivation and self-concept” (ibid).

We have classified the study skills presented to the students in the study skills module described in this paper into the three categories of Hattie and Biggs (see Table 1).

The reflections carried out after each study period are mainly within the metacognitive category.

2.2. Research Questions

Our aim was to answer the following five research questions:
(i) RQ1: Has the Program Integrating Course, and in particular the study skills module, contributed in such a way that the students have analyzed and changed their study habits?
(ii) RQ2: Is there any difference between the students’ intentions about how they should learn and the outcome?
(iii) RQ3: Which types of problems did the students experience that made it harder to use their chosen study skill?
(iv) RQ4: Do the students believe that the study skills module has made them improve their learning?
(v) RQ5: To which degree have weekly reminders influenced the outcome?

2.3. Research Method

In the Computer Science and Engineering PIC, 169 students from the first year took part in the study skills module.

At step 2 of the module (i.e., when the students were to write the first text, week 1 in Figure 1), we gave the students a preintervention web questionnaire with the following three questions:
(i) Are you satisfied with your study technique? (related to RQ1)
(ii) After informing yourself and reflecting about study techniques for this seminar, which study skills or study habits do you plan to change? (related to RQ2)
(iii) Which three of these six factors do you believe are the greatest risks for your study technique not being as good as you would like for the rest of the fall? (related to RQ3)

At step 5 of the module (i.e., when the students were to write their second text, evaluating their new habits, week 8 in Figure 1), we gave the students a postintervention web questionnaire with the following questions:
(i) For each study skill X:
(a) Did you plan to try X? (related to RQ2)
(b) If so, how much did you actually try X? (related to RQ2)
(c) If you tried X, what is your perception of the effects on your learning of trying X? (related to RQ4)
(ii) Which were the top 3 reasons that your study technique was not as good as you planned? (related to RQ3)
(iii) To sum up, are you today satisfied with your study technique? (related to RQ1)

Between steps 4 and 5, half of the students, randomly chosen, got emails reminding them once a week to try the new study skill(s). The first reminder was sent out two weeks after the seminar, and in total three reminders were sent out. In the second questionnaire, we could study the difference in answers between the group that got reminders and the group that did not, and thereby conclude whether the reminders influenced the intervention in some way, which should answer RQ5. We do not know to what extent the students in the no-reminders group were reminded by other students.
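For concreteness, the random split and reminder schedule could be implemented along the lines of the minimal sketch below. The roster, the helper function, and the use of Python’s random module are our own illustration and not part of the course infrastructure; only the design (a random half receiving three weekly reminders, starting two weeks after the seminar) comes from the study.

```python
import random

def assign_reminder_groups(student_emails, seed=None):
    """Randomly split the roster into two halves: one half will receive
    weekly reminder emails, the other will not (hypothetical helper)."""
    rng = random.Random(seed)
    shuffled = list(student_emails)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]  # (reminders, no_reminders)

# Usage sketch: three weekly reminders, starting two weeks after the seminar.
roster = [f"student{i}@example.edu" for i in range(169)]  # placeholder addresses
reminder_group, control_group = assign_reminder_groups(roster, seed=42)
for week in (2, 3, 4):  # weeks after the first seminar
    for email in reminder_group:
        pass  # send_reminder(email, week) -- actual email sending not shown
```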

To get a picture of the impact of the Program Integrating Course as a whole with respect to study skills, the following question has, for several years, been given to the students in the questionnaire at the final seminar of the academic year: “Has PIC promoted you to analyze and adapt your study habits?” (related to RQ1).

3. Results and Analysis

Since one of the intended learning outcomes of the course is to critically analyze and reflect on the structure and performance of the program and on one’s own study achievements, the questionnaires can be made mandatory, which we have made use of for the questionnaires reported in this study. This means that the dropout rate is almost zero. In the questionnaires, we usually add a question asking the respondents if they allow their answers to be used in a research study.

The prequestionnaire given before the intervention was answered by 168 of 169 students. The postquestionnaire was answered by 165 of 168 students (one student dropped out from the education during the period of the intervention).

The students had to choose at least one study skill to try, but they could choose as many as they liked of the seven specified skills and one unspecified skill. Figure 2 shows the study skills the students planned to try according to the prequestionnaire, and Table 2 shows which skills were in fact tried, as stated in the postquestionnaire. The mean number of study skills chosen was 3.9 according to the prequestionnaire and 3.2 according to the second, which means that the students wanted to try about half of the “offered” study skills, but by the second seminar many students had forgotten one or more of the chosen skills. Four skills were forgotten to a higher degree than the other four, namely, taking smart notes at lectures, going through the previous day’s and week’s teaching, planning the upcoming week’s studies, and reading the course literature in three steps. The answer “already doing it” was similar in the two questionnaires, except for maintaining a study diary, which only 2% of the students claimed to do in the prequestionnaire but 11% claimed to do in the postquestionnaire. Only 13% planned to try a study skill other than the seven specified.

The students who intended to try a certain study skill were asked in the postquestionnaire how often they in fact did try it. The results are presented in Table 3. We can see that three of the changes were hard to carry out, both consistently and more than a few times, namely, preparing before lectures, going through the previous day’s and week’s teaching, and stopping procrastinating. Only 6% or less of the students who planned to make these changes were able to do so consistently for the whole period of six weeks, and less than half of the students tried these changes more than a few times.

The students were asked how satisfied they were with their study technique before and after the module (Table 4). There was a slight increase in the proportion of students who were very satisfied with their study technique (13% vs 9%) and a slight decrease in the proportion who were not satisfied (26% vs 24%), but the differences were small, and a Mann–Whitney U test shows that the null hypothesis of no difference before and after the study skills module cannot be rejected ().
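As an illustration of the kind of test reported here, the sketch below shows how such a comparison could be run, assuming the satisfaction answers are coded on an ordinal scale; the counts are invented placeholders, not the actual survey data.

```python
# Minimal sketch: Mann-Whitney U test on ordinal satisfaction answers.
# Coding assumption: 0 = not satisfied, 1 = fairly satisfied, 2 = very satisfied.
# The counts below are illustrative placeholders, not the study's data.
from scipy.stats import mannwhitneyu

pre  = [2] * 15 + [1] * 105 + [0] * 45   # hypothetical pre-module answers
post = [2] * 20 + [1] * 105 + [0] * 40   # hypothetical post-module answers

stat, p = mannwhitneyu(pre, post, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p:.3f}")    # a large p means H0 cannot be rejected
```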

The students who tried a new study skill at least a few times were asked how they perceived its effects on their learning. The results are shown in Table 5. We can see that for four of the skills, about a quarter of the students perceived an obvious effect: preparing before lectures, taking smart notes at lectures, going through the previous day’s and week’s teaching, and maintaining a study diary. For the remaining three skills, as well as for skills not on the list, about half of the students saw an obvious effect. The top three skills were stopping procrastinating, followed by planning the upcoming week’s studies and reading the course literature in three steps. The proportion of students who did not notice any effect was low (one-fifth or smaller) for every skill except maintaining a study diary, for which one-third did not notice any effect.

When the students chose one or more study skills to try, they had to answer the question “Which three of these six factors do you believe are the greatest risks for your study technique not being as good as you would like for the rest of the fall?” In the postquestionnaire, about 7 weeks later, they were asked a similar question: “Which were the top 3 reasons that your study technique was not as good as you planned?” The answers to these questions are summarized in Table 6. Students who did use their study technique as planned did not answer the question in the second questionnaire. One-sixth of the students did not answer the second question, and some students stated only one or two reasons. Three-quarters of the students stated three reasons.

In the first two columns of Table 6, the proportion of students mentioning each of the reasons is presented. The last two columns of the table show a weighted value, where a reason is given 3 points if it is ranked first, 2 points if it is ranked second, and 1 point if it is ranked third. The risk of being disturbed by distractions was mentioned by 9/10 of the students in the prequestionnaire but only by 6/10 in the postquestionnaire. Lack of knowledge was considered a risk by 4/10 in the prequestionnaire but only by 2/10 in the postquestionnaire. The same decrease for these two reasons is seen when considering the points. On the other hand, when considering the points, the reason “did not manage to hold on for the whole period” was ranked higher afterwards than beforehand. In retrospect, all reasons were given about the same number of points, that is, their risks are about the same, except for lack of knowledge, which had a considerably lower ranking.
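The weighting of the ranked answers can be made explicit with the short sketch below; the 3-2-1 scoring rule is taken from the text, whereas the helper function and the example answers are our own illustration.

```python
# Minimal sketch of the 3-2-1 weighting of ranked reasons.
from collections import Counter

POINTS = {1: 3, 2: 2, 3: 1}  # rank -> points

def weighted_scores(ranked_answers):
    """ranked_answers: one list per student, giving up to three reasons in
    ranked order. Returns the total number of points per reason."""
    scores = Counter()
    for answers in ranked_answers:
        for rank, reason in enumerate(answers, start=1):
            scores[reason] += POINTS.get(rank, 0)
    return scores

# Invented example answers, not actual survey responses:
example = [
    ["distractions", "lack of time", "persistence"],
    ["lack of time", "distractions"],              # only two reasons given
    ["distractions", "persistence", "lack of knowledge"],
]
print(weighted_scores(example))
```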

Half of the students received three email reminders, and half of the students did not receive any. Our hypothesis was that the reminders would make it easier for the students to remember to use the chosen study skill(s); hence, the null hypothesis was that there was no difference between the conditions. However, the differences between the answers of the two groups were small (Table 7). When adding the answers for all eight study skills for each group, the proportion of students who “almost always” used the study skill was 16% for students getting reminders and 11% for students not getting reminders, but the proportion of students who did not use the study skill at all was the same for the two groups (about 14%). A Mann–Whitney U test shows that there was no significant difference between the conditions (), so the null hypothesis cannot be rejected.

Let us now turn to the questionnaires that have been given to the students of all three years at the end of each academic year. Table 8 summarizes the results of the question “Has PIC promoted you to analyze and adapt your study habits?” from four consecutive years. In this way, we can see how the students experience the effect of the Program Integrating Course with respect to study habits and how that experience changes over the years. The question did not specify a time span, that is, whether only the last year of the course or the whole course should be considered. Since a larger proportion (and also a larger absolute number) of students answered No to the question in the second and third years, this likely means that some students in years two and three only considered the last year, or did not remember how the course had promoted their study habits at the beginning. The proportion of students agreeing that PIC has promoted their ability to analyze and adapt their study habits at least to some degree was about 85% in year 1 and over 80% in year 2. For a substantial part of the students (between one-fifth and one-fourth) in the first year, the course had a high impact.

4. Discussion and Conclusions

The first research question was whether the Program Integrating Course and in particular the study skills module contributed in such a way that the students have analyzed and changed their study habits. As shown earlier in Table 4, the immediate effect of the study skills module did not significantly change the students’ own perception of how satisfied they were with their study skills. However, as seen in Table 3, 46% of the students always or pretty often used the study skills they tried out, and as seen in Table 5, most of the students believed most of the study skills presented were effective. The long-term effect of the entire three-year experience, as shown in Table 8, appears to be substantial, with only 22% of the students answering “no” or “don’t know” regarding whether the PIC course had promoted them to analyze and adapt their study habits.

The second research question was whether there is any difference between the students’ intentions and the outcome. The results showed a clear difference between the students’ intentions to change their study habits and the outcome, with about 54% of the students using the techniques they intended to use just a few times or not at all.

The third research question concerned which types of problems the students experienced that made it harder to use their chosen study skill. The results in Table 6 show that the students rated distractions as the top reason for their study technique not being as good as they had planned, followed by lack of time, lack of persistence, and not managing even to start trying. Lack of methodological knowledge decreased considerably after the module compared to before it, but even before the module it was the lowest rated of the alternatives. All in all, this indicates that a general lack of self-regulatory capability, rather than a lack of insight into what needs to be done, is the main obstacle, and that a module such as the one described here can give new ideas on how to deal with inefficient study habits.

The fourth research question was whether the students think that the study skills module has improved their learning. The results point in different directions. As discussed in the results section and seen in Table 4, there was no significant increase in how satisfied the students were with their study technique after the initial module. However, about 86% of the attempts to try a new skill resulted in the students trying it at least a few times, and in about 43% of the attempts the skill was tried pretty often or always, as seen in Table 3. For these 86%, the self-perceived effect on learning was very high: 35% thought there was an obvious effect on learning and 51% thought there most likely was an effect. A simple multiplication of these numbers indicates that about ¾ of the students perceived an obvious or likely positive effect.
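Spelled out, and assuming the two perception shares can simply be added and applied to the 86% of attempts, the estimate is 0.86 × (0.35 + 0.51) = 0.86 × 0.86 ≈ 0.74, that is, roughly three-quarters of the students.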

The fifth and final research question was whether weekly email reminders influenced the outcome. No significant difference could be detected between the two groups. According to research on behavior change [21], triggers or cues are an important factor when aiming for a behavior change such as this. However, these triggers were strictly time-based and were seen by the students when they checked their email, which is not necessarily when they intend to study. As shown in previous studies [22, 23], the timing of such triggers is very important, which could explain the lack of a significant difference between the groups. We therefore suggest further research in which triggers are delivered at more individualized moments.

Stopping procrastinating is the skill that the students, after the intervention, believed had the greatest effect on their learning (59% clear effect, 31% probable effect, and 10% no noticeable effect), and it was also one of the two skills that most students planned to try (stated by 65% before and 60% afterwards). However, it was also one of the skills that was hardest to use consistently. To help the students become more skilled at not procrastinating, we have a separate topic later in PIC that concentrates only on procrastination, where the students try an antiprocrastination habit between two seminars and reflect on it afterwards, in the same manner as the study skills activity described above.

Note that the students answering the question summarized in Table 8 had had a study motivation and study skills seminar at the beginning of the course, but without the task of trying at least one new study skill and reflecting on it afterwards.

Before we introduced the Program Integrating Course with its study skills and study motivation module, the only study skills activity in the program was a single lecture on study skills, and in our experience this is a quite common situation. We would recommend replacing such a lecture with a study skills module as described in this article, where the students are told to try, evaluate, and reflect on new study skills, and furthermore, as in the Program Integrating Course, to repeatedly reflect on study skills together with other students, preferably from different grades, throughout the whole education. We believe that the Program Integrating Course approach will also improve self-regulated learning skills in general.

Data Availability

The survey data used to support the findings of this study are available from the second author upon request.

Conflicts of Interest

The authors are course coordinators of the Program Integrating Courses of the Media Technology program and Computer Science and Engineering program, respectively.

Acknowledgments

This research was funded by the KTH Royal Institute of Technology.