Abstract

Present-day university systems need to educate graduates who are confident and highly independent, attributes that are especially relevant to engineering. Active methods are therefore needed that analyze the prior knowledge of students and that base teaching on student self-regulation and self-assessment. In this study, we worked with a sample of 116 architecture students taking a Structural Engineering module (61 in the experimental group and 55 in the control group). The objectives of the investigation were (1) to test whether significant differences exist in the knowledge of students after a training program in self-regulation and (2) to test whether the use of rubrics improves the perceptions of students with regard to their own knowledge. We found that students trained in self-regulation methodologies improved their procedural knowledge in the field of structural engineering. Likewise, student self-perceptions of their own knowledge increased in relation to the design and estimation of structural elements and the graphic representation of constructive elements.

1. Introduction

The global university system needs to educate confident graduates with a high degree of autonomy in their professional activity [1]. Better learning is an investment by the educational system in training more effective graduates for the global employment market. In particular, over the last decade, universities worldwide have made an effort to incorporate active methods into the classroom (group work, cooperative learning, constructive learning, meaningful learning, etc.) and evaluation methods in which students participate (self-evaluation processes) [2].

Self-regulation is prominent among these active methods [3] and represents a fundamental tool available to the teacher to facilitate successful learning among students. Self-regulation processes mean that it is the students themselves who represent the tasks, plan the steps for their solution, evaluate the processes and the products of their learning, and plan different problem-solving paths where necessary. All these techniques help individuals to learn better, developing deeper and more secure learning [4].

A direct relation also appears to exist between the processes of self-regulation, student motivation towards learning, and self-evaluation processes [5]. Recent research [6, 7] has pointed to the importance of feedback from the teacher in the development of student learning. However, some types of feedback are more effective than others [8]. Teacher feedback will be more or less effective depending on the use that students make of learning strategies, above all metacognitive and motivational strategies, as these strategies increase the learner's control over the task to be solved [9]. In addition, for feedback to be effective [10], teachers should (1) clearly define the task, (2) plan the problem-solving process, and (3) analyze the progress of their learners.

Likewise, to develop an accurate picture of student perceptions of their own learning, the teacher should [11]:
(1) clarify, before starting the teaching process, the competences that the students have to acquire,
(2) provide students with clear criteria for successful learning results,
(3) relate prior knowledge to the "meta" knowledge of the assignment,
(4) seek to reduce the gap between student beliefs about the subject and the objectives of the teaching-learning process,
(5) ensure feedback, as this reduces the gap between those beliefs and the objectives of the teaching-learning process.

We may therefore link successful learning directly to processes of self-regulation. Recent research [12, 13] has demonstrated that the skill of precise estimation of one's own performance, confidence in knowledge, and knowledge of strategies and of problem-solving can be learnt and can therefore be taught. Thus, the criteria that define teaching success are linked to teacher feedback and to the corrections made to student learning [14]. Likewise, the calibration of student perceptions of their own learning is a key factor in the development of metacognition and self-regulation. Self-regulation will be scant or nonexistent if students have no knowledge of their current state of learning and of the levels they wish to reach at the end of the degree course [15].

The teacher should therefore take into account the following factors:
(1) an analysis of the students' reflective processes,
(2) an analysis of the prior knowledge held by students before they start the course (the better and the more appropriate the prior learning, the better the learning outcomes; the worse the prior learning, the greater the overconfidence and the worse the learning outcomes),
(3) feedback that improves student self-regulation,
(4) definition of the competences or learning goals,
(5) structuring of the content and tasks in ways that help to improve student confidence in their own learning,
(6) the use of oral reinforcement during continuous evaluation to provide information on student learning processes [16].

One of the best instruments for enabling feedback in self-regulation processes is the rubric [17]. Its use allows teachers to obtain indicators that help them to improve their teaching plans and to increase the quality of student learning [18–20].

The aims of this research were to:
(1) establish the learning strategies that students use in the Structural Engineering module,
(2) test whether significant differences exist between the experimental group and the control group, following the self-regulated learning program, in conceptual and procedural knowledge of the Structural Engineering module,
(3) test whether significant differences exist between the experimental group and the control group, following the self-regulated learning program, in the self-perception of students with regard to the knowledge learnt on the Structural Engineering module.

2. Methodology

2.1. Participants

We worked with a sample of 116 students in the third year of their degree course in Technical Architecture who were taking the Structural Engineering module. Table 1 presents the characteristics of the sample.

2.2. Instruments
2.2.1. Learning Strategies Scales (ACRA) [21]

ACRA measures the use of 32 learning strategies concerning various aspects of information processing. Assessment is through a Likert scale (1 to 4) divided into the following: Scale I: acquisition of information (attentional and review strategies: 20 items, maximum raw score 80 points, and maximum percentile 99); Scale II: coding (mnemonics, organization, and elaboration: 46 items, maximum raw score 184 points, and maximum percentile 99); Scale III: retrieval (seeking and generating a response: 18 items, maximum raw score 72 points, and maximum percentile 99); and Scale IV: metacognition and information processing support strategies, which distinguishes two subscales: metacognitive strategies (self-knowledge, planning and regulation, and assessment: 17 items, maximum raw score 68 points, and maximum percentile 99) and information processing support strategies (self-instructions, self-control, and motivation: 17 items, maximum raw score 68 points, and maximum percentile 99). Cronbach's alpha was used to test the reliability of the scales, with the following results: Scale I: ; Scale II: ; Scale III: ; and Scale IV (metacognition and information processing support strategies): . Likewise, Cronbach's alpha applied to the instrument as used in this study gave the following results: Scale I: ; Scale II: ; Scale III: ; and Scale IV (metacognition and information processing support strategies): .
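As an illustration only, this kind of reliability analysis can be reproduced with a short script such as the following sketch (Python; the responses are simulated and the variable names are hypothetical, not taken from the study data), which computes Cronbach's alpha for a respondents-by-items matrix of Likert scores:

import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) matrix of Likert scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                               # number of items in the scale
    item_variances = items.var(axis=0, ddof=1)       # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)   # variance of the scale totals
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical example: 116 respondents answering the 20 items of Scale I (scores 1 to 4)
rng = np.random.default_rng(0)
scale_i_responses = rng.integers(1, 5, size=(116, 20))
print(f"Scale I alpha = {cronbach_alpha(scale_i_responses):.2f}")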

2.2.2. Teaching Program Based on Self-Regulated Learning in Problem Solving Related to Structural Engineering

This program consisted of 6 topics. Topic 1: knowledge of materials used in building; Topic 2: design and estimation of specific structural elements; Topic 3: design and testing of structures; Topic 4: constructive systems for the execution of foundations; Topic 5: graphic representation of constructive elements; and Topic 6: the solution of foundation-related problems. These topics were structured around (1) an analysis of the prior knowledge needed for the student to gain a meaningful grasp of the subject, (2) an explanation of the topic itself, (3) criteria for evaluating knowledge of the topic, (4) materials needed to work on the topic, (5) activities in the form of problems presented by the teacher to promote a meaningful understanding of the topic, and (6) generalization activities, problems, and tasks beyond those studied that permit the transfer of the acquired learning. We used a self-instructional methodology for the acquisition of these contents (both conceptual and procedural) based on the use of self-response questions: the teacher used self-regulated strategies with the students through self-administered questions. This methodology helped the students to interrelate the conceptual and the procedural knowledge.

2.2.3. Questionnaire on Prior Knowledge (See Appendix A)

It consists of 22 items that analyze the conceptual knowledge of the student with regard to Structural Engineering and 7 items that evaluate the procedural knowledge of students in problem solving that relates to Structural Engineering.
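As a purely illustrative sketch (Python; the response values are invented and the helper name is hypothetical), the questionnaire can be scored by averaging the 1-to-5 Likert responses described in Appendix A separately for the conceptual and the procedural items:

def subscale_means(responses, n_conceptual=22, n_procedural=7):
    """Mean Likert score (1-5) for the conceptual and procedural blocks of one student."""
    conceptual = responses[:n_conceptual]
    procedural = responses[n_conceptual:n_conceptual + n_procedural]
    return sum(conceptual) / len(conceptual), sum(procedural) / len(procedural)

# Hypothetical answers of one student: 22 conceptual items followed by 7 procedural items
answers = [3] * 22 + [4] * 7
conceptual_mean, procedural_mean = subscale_means(answers)
print(conceptual_mean, procedural_mean)  # 3.0 4.0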

2.2.4. Rubrics (See Appendix B)

A total of 6 rubrics evaluated the knowledge of students in the 6 topics described above (as an example, the rubrics corresponding to Topics 2 and 5 are presented in Appendix B). The rubrics were used by students to regulate and to assess their own learning. The self-assessment results were also used by the teacher as a procedure for giving students feedback on their learning. Likewise, the teacher used this strategy as a continuous assessment procedure for the outcomes of the student learning process.
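To illustrate how rubric self-assessments can be summarized as feedback per topic, here is a minimal sketch (Python); the criteria, achievement levels, and scoring scheme below are hypothetical examples, not the actual rubrics of Appendix B:

from dataclasses import dataclass

@dataclass
class Criterion:
    name: str
    level: int   # self-assessed achievement level, e.g., 1 (lowest) to 4 (highest)

def rubric_summary(criteria):
    """Mean achievement level across the criteria of one rubric."""
    return sum(c.level for c in criteria) / len(criteria)

# Hypothetical self-assessment for Topic 2 (design and estimation of structural elements)
topic2 = [
    Criterion("Chooses an appropriate structural model", 3),
    Criterion("Applies the relevant building standards", 2),
    Criterion("Estimates the structural element correctly", 3),
]
print(f"Topic 2 self-assessment: {rubric_summary(topic2):.2f} out of 4")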

2.3. Procedure

Before beginning the self-regulated learning program on problem solving, the ACRA learning strategies scales were administered to the two groups of students, together with the questionnaire on prior knowledge of Structural Engineering and the rubric-based evaluation scale. Subsequently, the program took place over one term using a self-regulated learning methodology for problem solving. Finally, we administered a postprogram test of acquired knowledge using the same rubric-based evaluation scale.

3. Data Analyses

Descriptive statistics (mean and standard deviation) were calculated, as well as a fixed-effect ANOVA (the group following the self-regulation program versus the control group), and the effect size (η²) was found. η² is the effect size, in other words, the proportion of the variance that is explained by the independent variable (factor) under study [22].
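As a minimal sketch of this analysis (Python with NumPy and SciPy; the score arrays are simulated, not the study data), a one-way fixed-effect ANOVA can be run and η² computed as the between-group sum of squares divided by the total sum of squares:

import numpy as np
from scipy import stats

def eta_squared(*groups):
    """Proportion of total variance explained by the group factor (eta squared)."""
    scores = np.concatenate(groups)
    grand_mean = scores.mean()
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_total = ((scores - grand_mean) ** 2).sum()
    return ss_between / ss_total

# Hypothetical post-test scores: experimental group (n = 61) versus control group (n = 55)
rng = np.random.default_rng(1)
experimental = rng.normal(7.2, 1.0, 61)
control = rng.normal(6.8, 1.0, 55)

f_value, p_value = stats.f_oneway(experimental, control)
print(f"F = {f_value:.2f}, p = {p_value:.3f}, eta squared = {eta_squared(experimental, control):.3f}")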

4. Results

A fixed-effect ANOVA was calculated for the learning strategies scales (ACRA) and for prior knowledge of Structural Engineering, to test whether the groups were homogeneous before the program started. As may be seen in Tables 2 and 3, no significant differences were found between the experimental group and the control group, either in the learning strategies that students used or in their prior knowledge of engineering structures, before the program began. We may therefore conclude that the groups were homogeneous.

The first hypothesis stated that "significant differences will exist in conceptual and procedural knowledge of Structural Engineering between the experimental group and the control group following the teaching program on self-regulation." A fixed-effect ANOVA was performed (self-regulation group versus control group). As may be seen in Table 4, significant differences were found after the program, corroborating better procedural knowledge of Structural Engineering in the experimental group.

The second hypothesis stated that "significant differences will exist after the teaching program between the self-perceptions of students in the experimental group and in the control group with regard to their own knowledge of Structural Engineering."

As may be seen in Table 5, significant differences were found between the groups in knowledge of Topic 2 (evaluated with rubric 2), which refers to the design and estimation of specific structural elements, and of Topic 5 (evaluated with rubric 5), which refers to the graphic representation of constructive elements.

5. Conclusions

The teaching program centered on self-regulated learning, feedback given to students on their learning processes, and the use of rubrics that facilitate student self-evaluation of their own learning. This program improved the procedural skills of students and the calibration of their perceptions of their own knowledge, especially with regard to knowledge relating to the design and estimation of structural elements and the graphic representation of constructive elements [6–8, 10]. However, this type of methodology requires more homogeneous and longer programs before its results may be generalized (to all the subjects of the degree course). Likewise, we have highlighted the importance of the teacher forming an idea of the prior knowledge that students hold of the subject. Such measures lead to better harmonization of the study plan with the subject matter and to more secure learning outcomes. Likewise, it is essential for students to become aware of their starting point when they embark on the study of a new subject and of their progress throughout the teaching-learning process. The use of rubrics by the teacher has shown itself to be effective, allowing both teacher and students to measure prior (conceptual and procedural) knowledge. Moreover, rubrics also assist the analysis of student progress during the teaching process. The final objective is to increase effective and independent student learning. This aspect is especially relevant for engineering courses, as the work of future engineers has to be autonomous, reliable, and based on problem-solving and decision-making.

In future research, our intention is to complete longitudinal studies that will provide insight into the relation between work on self-regulation and the efficiency at work of future graduates. We should point out that although we found significant differences, the effect sizes were low in all cases. These low values indicate that the teacher's self-regulatory approach was effective, even though it does not explain a high percentage of the variance (in the level of prior knowledge before and after the program and in the perception of knowledge). This result has to be considered in relation to the duration of the training (one semester). In order to achieve higher effect values, teaching based on self-regulation should be an ongoing practice throughout the degree course. Therefore, we propose as future research the expansion of self-regulation training procedures, involving more teachers from a single degree course.

Appendices

A. Questionnaire on Prior Knowledge

Module: Structural Engineering
Teacher:
Higher Polytechnic School:
Degree: Technical Architecture

A Likert-type scale from 1 to 5 was used for the measurement of the responses, where 1 is “not at all sure” and 5 is “totally sure.”

Conceptual Knowledge
(1) I can define the concept of a foundation.
(2) I can define the parts of the building foundation project.
(3) I can describe the essential keys of a geotechnical study.
(4) I can define the concept of a footing.
(5) I can complete a classification of different types of footings.
(6) I can apply the concept of a footing in a foundation project.
(7) I can define the concept of a slab.
(8) I can apply the concept of a slab in a foundation project.
(9) I can define the concept of a pile.
(10) I can apply the concept of a pile in a foundation project.
(11) I can define the concept of a shaft foundation.
(12) I can apply the shaft foundation concept to a foundation project.
(13) I can describe the concept of an isolated footing.
(14) I can describe the concept of a combined footing.
(15) I can describe the concept of an eccentric footing.
(16) I can describe the concept of a raft or mat footing.
(17) I can describe the concept of a continuous footing.
(18) I know the concept of admissible pressure and I can apply it to the different types of terrain.
(19) I can describe the concept of subsidence pressure.
(20) I can describe the concept of stress testing and I can apply it to the different types of foundations.
(21) I can describe the concept of verification of bedding material and I can apply it to the measurement of different structures.
(22) I can complete structural sizing on the basis of isolated footings.
(23) I can complete structural sizing on the basis of rigid footings.
(24) I can complete structural sizing on the basis of flexible footings.

Procedural Knowledge
(1) I search for information by myself, related to theory and practice, to solve the proposed problems.
(2) I use planning strategies to solve problems.
(3) When I am unable to solve a problem in an acceptable way, I try to do it in another way, starting from the mistakes I have made, with the intention of not repeating them.
(4) I give myself orders when I try to solve problems, which guide the steps I take to resolve them.
(5) When I cannot solve a problem, I do not abandon it and I try to solve it again.
(6) When I am engaged in a problem, I try not to let myself be distracted.
(7) When I cannot do a problem, I encourage myself to do problems that are more difficult.

B. Rubrics

Topic 2. Design and estimation of specific structural elements.

Competence 2. Capability to design, calculate, and estimate specific structural elements, and capability to know and apply the standards and regulations governing building structures (see Table 6).

Topic 5. Graphic representation of constructive elements.

Competence 5. Capability to apply drafting to constructive elements (see Table 7).

Abbreviations

EG: Experimental group
CG: Control group
M: Mean age
SD: Standard deviation.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.