Abstract

Teaching robotics is a challenge in many universities due to the mathematical concepts used in this area. In recent years, augmented reality has improved learning in several engineering areas. In this paper, a platform for teaching robotic arm manipulation concepts is presented. The system includes a homemade robotic arm, a control system, and the RAR@pp. The RAR@pp is focused on learning robotic arm manipulation algorithms through the detection of markers on the robotic arm and the real-time display of the joint angles based on the data obtained by the control system. Details on the design of the platform are presented, and the related results are discussed. Experimental data about the usability of the application are also shown.

1. Introduction

In the last decade, Information and Communication Technologies (ICT) have spread to all areas of society. An example of this is the area of learning. The constant changes in traditional teaching methodologies seek more productive methods, with the goal of improving the learning experience and raising the intellectual level of students. Currently, the use of mobile devices has increased continuously. The growth in the demand for cell phones and tablets is due to the evolution of technologies and the implementation of many functionalities. One of the new technologies in development is augmented reality (AR). This technology allows the coexistence of real and virtual objects in the same environment.

Among its different qualities, augmented reality technology has proven adaptable to a high number of scenarios. Due to its portability and usability, augmented reality may be implemented on various equipment such as personal computers, mobile devices, and smartphones, among others. Besides, augmented reality technology can be combined with other techniques to enrich applications, thus emerging as a tool to improve the teaching-learning process inside classrooms and laboratories [1]. An area with the potential to implement augmented reality technology is robotics, since it involves topics that require 3D perception [2]. Nevertheless, traditional teaching methods make use of 2D didactic material, which makes it difficult to understand these topics. For this reason, the present work shows the implementation of a robotic platform equipped with augmented reality technology. The purpose is to make the understanding of robotics topics more interactive in an educational setting.

In this work, a platform for teaching robotic arm manipulation concepts is presented. The system includes a homemade robotic arm, a control system, and the RAR@pp (Robotics through Augmented Reality Application). The RAR@pp is focused on learning robotic arm manipulation algorithms through the detection of markers on the robotic arm and the real-time visualization of the angles of each articulation of the homemade robotic arm using augmented reality. These angles are obtained from the robot based on the data obtained by the encoders of each motor and transmitted to the RAR@pp using Bluetooth communication. This application allows students to send values to the robotic arm and visualize in real time the response of the arm to the commands using the RAR@pp. The application is specifically designed for mechatronics, information technology, and manufacturing undergraduate programs. The incorporation of augmented reality technology into the mobile application allows real-time viewing of joint angles through virtual objects corresponding to the real angles on the physical platform. The robotic platform is programmed with a reference profile using a desktop application, and by means of a proportional-integral-derivative (PID) controller applied to each joint, the robot follows the provided trajectory, allowing the joints to reach the required angles. This paper is structured as follows. In Section 2, a concise review of the state of the art is given. In Section 3, the design and implementation of the proposed system are described. In Section 4, the experimental setup and results are described. Finally, in Section 5, conclusions are drawn, and future work is outlined.

2. State of the Art

Nowadays, augmented reality represents a potential solution to problems in several areas such as robotics [3], teaching [4], mobile apps [5], and medicine [6]. For instance, Clemente et al. [7] proposed an improvement in the sensory feedback of a robotic hand through visual elements. Although other options such as vibro- or electrotactile stimulation are typically used as feedback in prostheses, one advantage of the AR implementation is the increased sensing resolution and the ease of adaptation to the real world. Medical procedures could also benefit from the combination of AR and robotics. In [8], a robotic system was proposed for the treatment of tumors using the ablation technique. The AR implementation helped to increase precision and consistency, which could be reflected in better medical outcomes. Augmented reality can also be used for safety purposes. Quercioli [6] developed a new approach to visualize lasers without any risk of eye damage. The mechanism consists of a modified smartphone camera without the infrared filter and a Google Cardboard-type viewer. The system was successfully implemented to obtain real-time images of an Nd:YAG laser and a Ti:sapphire oscillator, both at near-infrared emission, which is widely used in medical procedures.

In [2], a desktop application named Build-A-Robot was developed to help students understand the forward kinematics of serial robot arms. The app improves the visualization and configuration of a 3D robotic arm in a controlled (virtual) environment. Jara et al. [9] developed an e-learning system based on a robotic platform and a graphical user interface with an integrated AR module. This platform can be used to control a robot arm through the Internet, and the system is capable of planning a path remotely using augmented reality.

In a work oriented to improving student learning, Ibañez et al. [10] developed an electromagnetism experiment using augmented reality. Questions associated with fundamental topics, for example, Coulomb's law, the electric field, and Ohm's law, were prepared. A test was carried out with high school students comparing a web-based tool and the AR app in an experimental/control group design. Although the results showed that the students' outcomes were statistically similar in both cases, motivation to study was better with the AR app. This factor is also mentioned, along with creativity, by Wei et al. [4]. This is a very interesting finding, since the students' attitude is fundamental during the learning process. Higher education also has great potential for implementing augmented reality as a learning tool. Martín-Gutiérrez et al. [11] developed an app for training students in real electrical labs using AR. One advantage of this work is that it was designed to promote independent as well as collaborative work in laboratories, which could be helpful mainly in engineering environments.

AR has significant growth potential in advanced manufacturing; for example, Ni et al. [12] created a haptic robot for programming welding paths. The main advantages of this prototype were a user-friendly interface and the possibility of improving the work through seam-tracking sensors. In another work related to programming, Collett and MacDonald [13] designed a system to test AR-based debugging.

Processing speed is a main concern in mobile applications. Ruan and Jeong [14] proposed and implemented the substitution of the traditional markers elaborated with ARToolkit [15] by Quick Response (QR) codes in an AR system. A related work applying QR codes to AR apps was developed by Kan et al. [16]. Barcodes have also been used in combination with AR for commercial purposes [17]. AR has also been used for broadcast enhancement, as mentioned by Yan and Hu [18], which in general depends on AR display, AR tracking, and robotic AR broadcasting.

3. Proposed System

In this section, the proposed system is described. The system includes three main components. The first is a homemade physical robotic platform: a 2-DoF robotic arm and the hardware required for moving it, including a control module which receives signals from the desktop application and sends back the angles obtained by the encoders of the robotic arm. The second component is the desktop application, focused on allowing students to prototype different control algorithms, and the last one is the RAR@pp, for the visualization of the angles in real time.

The dataflow of the proposed system is shown in Figure 1. The details of each component are described below; a possible wire format for these exchanges is sketched after the list:

(1) The desktop application sends commands to the robotic arm controller using a USB serial protocol.
(2) The robotic arm controller sends back the angles of each articulation; these angles are used for plotting a comparison graph between the desired and real paths. The controller generates the movement commands for each articulation based on the information obtained from the desktop application.
(3) The mobile application sends a connection request to the robotic arm over a Bluetooth channel. The robotic arm receives this request and grants the connection. Once the connection is established, the mobile application identifies the markers located on the robot arm and sends a request to the robotic arm controller.
(4) Each robot articulation has a marker. In this specific case, due to the design of the robotic arm, only two markers were used, but the design can be extended to robotic arms with more than two articulations.
(5) The mobile application receives the angle of each articulation and displays it using a virtual protractor.
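The paper does not specify the wire format used on these links; the following C++ sketch shows one plausible line-oriented framing for a reference command (desktop to controller) and an angle reply (controller to mobile application). The field layout, separators, and function names are illustrative assumptions, not the platform's actual protocol.

```cpp
#include <cstdio>
#include <cstddef>

// Hypothetical line-oriented framing for the two links; the actual
// protocol used by the platform is not detailed in the paper.

// Reference command (desktop -> controller), e.g. "R,0,45.0,2.0\n":
// joint 0 should reach 45.0 degrees in 2.0 seconds.
void formatReference(char *buf, size_t n, int joint, float angleDeg, float timeS) {
    std::snprintf(buf, n, "R,%d,%.1f,%.1f\n", joint, angleDeg, timeS);
}

// Angle reply (controller -> mobile app), e.g. "A,1,44.3\n":
// joint 1 is currently at 44.3 degrees.
void formatAngleReply(char *buf, size_t n, int joint, float angleDeg) {
    std::snprintf(buf, n, "A,%d,%.1f\n", joint, angleDeg);
}
```

A simple text framing of this kind keeps both links human-readable, which is convenient when students debug the platform with a serial monitor.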

3.1. Main Blocks of the Desktop App

The desktop application is designed to send commands to the control system of the robotic arm using a serial USB connection and to receive feedback about the angle of each articulation. This application was developed using Matlab/Simulink®, and the position of each articulation was independently controlled using a proportional-integral-derivative (PID) controller. The desktop application also allows the user to perform simulations and generates comparison plots between the real and desired trajectories, taking as input the times and angles for each articulation.
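The PID controllers themselves are implemented in Matlab/Simulink, whose block diagram is not reproduced here; as a language-neutral illustration, the following C++ sketch shows the discrete-time control law applied independently to each joint. The gains, sample period, and saturation limits are placeholders, not the tuned values of the platform.

```cpp
#include <algorithm>

// Minimal discrete PID of the kind applied independently to each joint.
// Gains, sample period, and output limits are illustrative placeholders.
struct Pid {
    float kp, ki, kd;       // controller gains (hypothetical)
    float dt;               // sample period in seconds
    float integral  = 0.0f; // accumulated integral term
    float prevError = 0.0f; // error from the previous sample

    // Returns a control effort saturated to the PWM range [-255, 255].
    float update(float setpointDeg, float measuredDeg) {
        float error = setpointDeg - measuredDeg;
        integral += error * dt;                      // integral term
        float derivative = (error - prevError) / dt; // derivative term
        prevError = error;
        float u = kp * error + ki * integral + kd * derivative;
        return std::clamp(u, -255.0f, 255.0f);
    }
};
```

One instance would be created per joint; the sign of the saturated output selects the motor direction and its magnitude maps onto the PWM duty cycle sent to the motor driver.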

3.2. Main Blocks of the Control Module

In Figure 2, a flow diagram including the software routines of the control module is shown. These software modules are hosted on an Arduino board with an HC-05 Bluetooth shield. Each of these modules is described below; a skeleton of the routines is sketched after the list:

(i) Configuration routine. This module establishes the configuration parameters of both serial connections (Bluetooth and USB), the transfer speeds, and the pin modes required for external communication.
(ii) Connection request manager. This module detects whether there is at least one connection request from the mobile application or from the desktop application.
(iii) Get base angle routine. This module translates the reading obtained by the encoder located in the base articulation into digital values in order to compute the angle of the base. This angle is sent to the mobile application over the previously established Bluetooth connection.
(iv) Get arm angle routine. This module translates the reading obtained by the encoder located in the arm articulation into digital values in order to compute the angle of the arm. This angle is sent to the mobile application over the previously established Bluetooth connection.
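A minimal Arduino-style skeleton of these routines is sketched below, assuming a SoftwareSerial link to the HC-05 and single-character angle requests from the mobile application. Pin numbers, baud rates, and the encoder resolution are illustrative assumptions, and the encoder interrupt service routines are omitted.

```cpp
#include <SoftwareSerial.h>

// Assumed wiring and resolution; the paper does not list these values.
const int BT_RX = 10, BT_TX = 11;         // pins wired to the HC-05 module
const float COUNTS_PER_DEGREE = 4.0f;     // optical encoder resolution

SoftwareSerial bluetooth(BT_RX, BT_TX);
volatile long baseCounts = 0, armCounts = 0; // updated by encoder ISRs (omitted)

void setup() {                 // configuration routine
  Serial.begin(9600);          // USB link to the desktop application
  bluetooth.begin(9600);       // Bluetooth link to the RAR@pp
}

float getBaseAngle() { return baseCounts / COUNTS_PER_DEGREE; } // get base angle routine
float getArmAngle()  { return armCounts  / COUNTS_PER_DEGREE; } // get arm angle routine

void loop() {                  // connection request manager
  if (bluetooth.available() > 0) {
    char req = bluetooth.read();
    if (req == 'B') bluetooth.println(getBaseAngle());      // reply with base angle
    else if (req == 'A') bluetooth.println(getArmAngle());  // reply with arm angle
  }
  // Reference commands arriving from the desktop application over
  // Serial would be parsed and dispatched to the joint controllers here.
}
```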

3.3. Main Blocks of the Mobile App

The main blocks of the mobile application are shown in Figure 3. The details of each block are described below; a sketch of the marker detection step follows the list:

(i) Wireless connection module (WCM). This module allows users to detect whether any compatible robot arm Bluetooth connection is available. On the first connection, this module asks the user to pair the device with the platform. Once the platform has been paired, the module is ready to pass data from the platform to the application, activating the rest of the application functionalities.
(ii) Image acquisition module (IAM). This module serves as a bridge between the physical sensor and the application. The IAM accesses the camera buffer to obtain the image captured by the camera sensor.
(iii) Marker localization module (MLM). This module obtains the input video from the IAM and performs the marker localization using the data previously obtained in the training phase for the specific markers located on the platform. Once a marker has been located, the obtained coordinates are stored and sent to the ARIM.
(iv) Articulation degree module (ADM). This module requests the articulation angle corresponding to each marker located by the MLM. The WCM returns the angle for each located marker, which is passed to the ARIM.
(v) Augmented reality integrator module (ARIM). This module takes as input the spatial localization of each marker obtained by the MLM and the angle of each marker obtained by the ADM, and generates the final image, where the protractor with the obtained angles is overlaid on the image obtained by the IAM. Finally, this image is shown to the user on the device screen.
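The mobile modules are written in Java (Section 4.2), but ARToolkit's native API is a C library, so the per-frame detection step performed by the MLM can be sketched directly against that API. The binarization threshold, the pattern id handling, and the function name are assumptions; the Java wrapping used in the actual app is not shown.

```cpp
#include <AR/ar.h>

// Per-frame marker localization as the MLM performs it, sketched with the
// classic ARToolkit C API. Threshold and geometry arguments are illustrative.
int locateMarker(ARUint8 *frame, int pattId,
                 double center[2], double width, double pattTrans[3][4]) {
  ARMarkerInfo *markers;
  int numMarkers;
  // Detect all candidate markers in the current camera frame
  if (arDetectMarker(frame, 100, &markers, &numMarkers) < 0) return -1;
  for (int i = 0; i < numMarkers; i++) {
    if (markers[i].id == pattId) {
      // Recover the marker pose used to anchor the virtual protractor
      arGetTransMat(&markers[i], center, width, pattTrans);
      return i;   // index of the matched marker
    }
  }
  return -1;      // requested marker not visible in this frame
}
```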

4. Experimental Setups and Results

4.1. Hardware Tools

The hardware components used in the platform are the following:

(i) Two 12 V DC motors with optical encoders.
(ii) An L298N dual H-bridge motor driver.
(iii) A 12 V, 10 A power supply.
(iv) An HC-05 Bluetooth module for communicating the Arduino via Bluetooth with the AR application.
(v) An Arduino UNO board for hosting the communication modules.
(vi) A homemade robotic arm designed using SolidWorks® and built using materials from local providers. The SolidWorks design is shown in Figure 4(a), and the final build is shown in Figure 4(b).
(vii) A mobile device with a back camera and a Bluetooth adapter. The developed application is deployed on this device, and several tests were performed in order to validate the localization of the markers and the image visualization. The proposed application has been validated on devices with several Android versions. In Table 1, a list of the devices and the specifications used for testing the proposed application is shown. The application worked without problems on all the listed devices.

In Figure 5, the connections required among the Arduino board, the driver, and the power source are shown. The Arduino board hosts the control module and sends commands to the motor driver, and each motor is connected to the dual motor driver. The Arduino board receives commands for the robot arm from the desktop application through the USB connection, and the angles of each arm articulation are sent to the mobile application using the Bluetooth connection.
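As an illustration of this wiring, the sketch below drives one joint motor through one channel of the L298N from a signed control effort such as the PID output; the pin assignment is an assumption, since the paper does not list it.

```cpp
#include <math.h>

// One H-bridge channel of the L298N (assumed pins; adjust to the wiring).
const int IN1 = 7, IN2 = 8;   // direction inputs of the channel
const int ENA = 9;            // PWM enable pin of the channel

void setupMotorPins() {
  pinMode(IN1, OUTPUT);
  pinMode(IN2, OUTPUT);
  pinMode(ENA, OUTPUT);
}

// u is a signed control effort in [-255, 255], e.g. a PID output;
// its sign selects the direction and its magnitude the duty cycle.
void driveMotor(float u) {
  int duty = (int)fabs(u);
  if (duty > 255) duty = 255;
  digitalWrite(IN1, u >= 0 ? HIGH : LOW);
  digitalWrite(IN2, u >= 0 ? LOW : HIGH);
  analogWrite(ENA, duty);
}
```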

4.2. Software Tools

The desktop application was developed using Matlab and Simulink, on a desktop computer with Windows OS.

The mobile application was developed using the Android Studio IDE with the Android SDK and the Java SE Development Kit. The modules of the application were written in Java. In addition, ARToolkit was used for the marker recognition and the integration of virtual and real objects in the live capture obtained by the camera sensor of the device [15].

4.3. Designed Application

In Figure 6, the base (Figure 6(a)) and arm (Figure 6(b)) markers are shown. These markers were used for training the ARToolkit classification model for their recognition.

In Figure 7, the desktop application and the mobile application are shown.

In Figure 8, two frames of the obtained angles of each articulation and their visualization in the app are shown. From the obtained images, it is possible to conclude that the recognition of each marker is performed successfully, and the protractor with the obtained angles is displayed correctly at each of the tested angles.

4.4. Time and Angle Results

In order to verify the alignment precision of the two axes (base and shoulder), measurements were carried out using Matlab. Figure 9(a) shows the results for the base angle trajectories, where the position error ranges from 0.75° to 3.8°. In the case of the shoulder axis (Figure 9(b)), the variations range from 0.08° to 7.28°. These variations could be reduced by using additional control strategies. With respect to the time lag between sending the instructions and the display of the response, a delay of 1 to 2 seconds was observed in the experiments.

5. Conclusions and Future Work

In this paper, an approach for teaching robotic arm manipulation concepts was proposed. The system includes a homemade robotic arm, a control system, and the RAR@pp. The RAR@pp is focused on learning robotic arm manipulation algorithms through the detection of markers on the robotic arm and the real-time visualization of the angles of each articulation of the homemade robotic arm using augmented reality. These angles are obtained from the robot based on the data obtained by the encoders of each motor and transmitted to the RAR@pp using Bluetooth communication. This application allows students to send configuration parameters to the robotic arm and visualize in real time the response of the arm to the commands using the RAR@pp. The application was tested on a large number of mobile devices, including smartphones and tablets.

The proposed platform helps capture students' attention, facilitating the understanding of complex robotics and kinematics concepts. The implementation of a smartphone-based application will allow a large number of students to access this type of educational resource, improving their performance and their understanding of key concepts of robotics and kinematics.

In this work, basic augmented reality techniques have been applied to the visualization of the arm articulation angles. Future work could be oriented to replacing the use of markers with direct recognition of the shape of the object for AR applications. Direct object recognition capabilities could be used for working with different types of arms and with more degrees of freedom. Moreover, several sensors could be added to the robotic platform in order to expand its capabilities. For example, current sensors could be used to visualize the effect of torque on the axis motors. Pressure sensors in an end effector or inertial sensors to study the device behavior could also be implemented. The RAR@pp could also be extended to display information about motor temperature and to measure the execution time of a specific task.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

The present project was partially funded by the National Council of Science and Technology of Mexico through a scholarship granted to Karla E. Bautista Hernández.