Mobile Information Systems
Volume 2018, Article ID 6047034, 8 pages
Research Article

An Education Application for Teaching Robot Arm Manipulator Concepts Using Augmented Reality

1Instituto Tecnológico Superior de Alvarado, Alvarado, VER, Mexico
2Instituto Tecnológico de Veracruz, Veracruz, VER, Mexico
3Universidad Politécnica de Victoria, 87138 Cd. Victoria, TAMPS, Mexico
4Universidad Politécnica de San Luis Potosí, 78363 San Luis Potosí, SLP, Mexico

Correspondence should be addressed to Carlos A. Calles-Arriaga; ccalles@upv.edu.mx

Received 12 May 2018; Accepted 9 July 2018; Published 6 August 2018

Academic Editor: Byeong-Seok Shin

Copyright © 2018 Martín Hernández-Ordoñez et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Teaching robotics is a challenge in many universities due to the mathematical concepts involved in this area. In recent years, augmented reality has improved learning in several engineering areas. In this paper, a platform for teaching robotic arm manipulation concepts is presented. The system includes a homemade robotic arm, a control system, and the RAR@pp. The RAR@pp is focused on learning robotic arm manipulation algorithms by detecting markers on the robotic arm and displaying in real time the values obtained by the control system. Details on the design of the platform are presented, and the related results are discussed. Experimental data about the usability of the application are also shown.

1. Introduction

In the last decade, Information and Communication Technologies (ICT) have spread to all areas of society; learning is one example. The constant changes in traditional teaching methodology seek more productive methods, with the goal of improving the learning experience and increasing the intellectual level of students. Currently, the use of mobile devices continues to increase. The growth in demand for cell phones and tablets is due to the evolution of these technologies and the implementation of many new functionalities. One of the new technologies in development is augmented reality (AR), which allows real and virtual objects to coexist in the same environment.

Among its different qualities, augmented reality technology is adaptable to a large number of scenarios. Due to its portability and usability, it can be implemented on various equipment such as personal computers, mobile devices, and smartphones. Moreover, AR can be combined with other techniques to enrich applications, thus emerging as a tool to improve the teaching-learning process in classrooms and laboratories [1]. One area with the potential to implement augmented reality technology is robotics, since it involves topics that require 3D perception [2]. Nevertheless, traditional teaching methods rely on 2D didactic material, which makes these topics difficult to understand. For this reason, the present work describes the implementation of a robotic platform equipped with augmented reality technology. The purpose is to achieve a more interactive understanding of robotics topics in an educational setting.

In this work, a platform for teaching robotic arm manipulation concepts is presented. The system includes a homemade robotic arm, a control system, and the RAR@pp (Robotics through Augmented Reality Application). The RAR@pp is focused on learning robotic arm manipulation algorithms by detecting markers on the robotic arm and visualizing in real time, through augmented reality, the angles of each articulation of the homemade robotic arm. These angles are obtained from the encoders of each motor and transmitted to the RAR@pp over Bluetooth. The application allows students to send values to the robotic arm and visualize in real time the response of the arm to the commands using the RAR@pp. It is specifically designed for mechatronics, information technology, and manufacturing undergraduate programs. The incorporation of augmented reality technology into the mobile application allows real-time viewing of joint angles through virtual objects corresponding to the real angles on the physical platform. The robotic platform is programmed with a reference profile using a desktop application, and by means of a proportional-integral-derivative (PID) controller applied to each joint, the robot follows the provided trajectory, allowing the joints to reach the required angles. This paper is structured as follows. In Section 2, a concise review of the state of the art is given. In Section 3, the design and implementation of the proposed system are described. In Section 4, the experimental setup and results are described. Finally, in Section 5, conclusions are drawn and future work is outlined.

2. State of the Art

Nowadays, augmented reality represents a potential solution to problems in several areas such as robotics [3], teaching [4], mobile apps [5], and medicine [6]. For instance, Clemente et al. [7] proposed an improvement in the sensory feedback of a robotic hand using visual elements. Although other options such as vibro- or electrotactile stimulation are typically used as sensory feedback in prostheses, advantages of the AR implementation include increased sensory resolution and ease of adaptation to the real world. Medical procedures could also benefit from the combination of AR and robotics. In [8], a robotic system was proposed for the treatment of tumors with the ablation technique; the AR implementation helped to increase precision and consistency, which could be reflected in better medical outcomes. Augmented reality can also be used for safety purposes. Quercioli [6] developed a new approach to visualizing lasers without any risk of eye damage. The mechanism consists of a modified smartphone camera without the infrared filter and a Google Cardboard-type viewer. The system was successfully used to obtain real-time images of a Nd:YAG laser and a Ti:sapphire oscillator, both emitting in the near infrared, which is widely used in medical procedures.

In [2], a desktop application named Build-A-Robot was developed to help students understand the forward kinematics of serial robot arms. The app improves the visualization and configuration of a 3D robotic arm in a controlled (virtual) environment. Jara et al. [9] developed an e-learning system based on a robotic platform and a graphical user interface with an integrated AR module. This platform can be used to control a robot arm through the Internet, and the system is capable of planning a path remotely using augmented reality.

In a work oriented toward improving student learning, Ibáñez et al. [10] developed an electromagnetism experiment using augmented reality. Questions associated with fundamental topics, for example, Coulomb’s law, the electric field, and Ohm’s law, were prepared. A test was carried out with high school students comparing a web-based tool and the AR app in an experimental/control group design. Although the results showed that student outcomes were statistically similar in both cases, motivation to study was higher with the AR app. This factor, along with creativity, is also mentioned by Wei et al. [4]. This is a very interesting finding, since student attitude is fundamental during the learning process. Higher education also has great potential for implementing augmented reality as a learning tool. Martín-Gutiérrez et al. [11] developed an app for training students in real electrical labs using AR. One advantage of this work is that it was designed to promote independent as well as collaborative work in laboratories, which could be helpful mainly in engineering environments.

AR has significant growth potential in advanced manufacturing; for example, Ni et al. [12] created a haptic robot interface for programming welding paths. The main advantages of this prototype were a user-friendly interface and the possibility of improving it with seam-tracking sensors. In another work related to programming, Collett and MacDonald [13] designed a system to test AR-based debugging of mobile robot software.

Processing speed is a main concern in mobile applications. Ruan and Jeong [14] proposed and implemented the substitution of traditional markers elaborated with ARToolkit [15] by Quick Response (QR) codes in an AR system. A related work applying QR codes to AR apps was developed by Kan et al. [16]. Barcodes have also been used in combination with AR for commercial purposes [17]. AR has also been used for broadcasting enhancement, as discussed by Yan and Hu [18], which in general depends on AR display, AR tracking, and robotic AR broadcast technologies.

3. Proposed System

In this section, the proposed system is described. The system includes three main components. The first is a homemade physical robotic platform: a 2-DoF robotic arm and the hardware required to move it, including a control module that receives signals from the desktop application and sends back the angles obtained by the encoders of the robotic arm. The second component is the desktop application, focused on allowing students to prototype different control algorithms, and the last one is the RAR@pp, for the visualization of the angles in real time.

The dataflow of the proposed system is shown in Figure 1. The details of each component are described below:

(1) The desktop application sends commands to the robotic arm control using a USB serial protocol.
(2) The robotic arm control sends back the angles of each articulation; these angles are used for plotting a comparison graph between the desired and real paths. The robotic control generates the movement commands for each articulation based on the information obtained from the desktop application.
(3) The mobile application sends a connection request to the robotic arm over a Bluetooth channel. The robotic arm receives this request and grants the connection. Once the connection is established, the mobile application identifies the markers located on the robot arm and sends a request to the robotic arm controller.
(4) Each robot articulation has a marker. In this specific case, due to the design of the robotic arm, only two markers were used, but the design can be extended to robotic arms with more than two articulations.
(5) The mobile application receives the angle of each articulation and displays them using a virtual protractor.

Figure 1: Main components and dataflow of the proposed system.
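The request/response cycle in steps (3)–(5) can be sketched as follows. This is an illustrative Python sketch, not the platform's actual Java/Arduino code, and the `GET_ANGLE`/`ANGLE` message format is a hypothetical stand-in for the real Bluetooth exchange:

```python
# Sketch of the marker-driven angle request cycle (hypothetical protocol).
# A real deployment would replace FakeBluetoothLink with an RFCOMM socket
# to the HC-05 module.

class FakeBluetoothLink:
    """Stands in for the Bluetooth channel to the robot arm controller."""
    def __init__(self, angles):
        self.angles = angles  # articulation id -> current angle in degrees

    def request(self, message):
        # The controller answers "ANGLE:<id>:<deg>" for "GET_ANGLE:<id>".
        _, art_id = message.split(":")
        return f"ANGLE:{art_id}:{self.angles[int(art_id)]}"

def poll_visible_markers(link, detected_markers):
    """For each marker detected in the camera frame, fetch its joint angle."""
    angles = {}
    for art_id in detected_markers:
        reply = link.request(f"GET_ANGLE:{art_id}")
        _, rid, deg = reply.split(":")
        angles[int(rid)] = float(deg)
    return angles

link = FakeBluetoothLink({0: 45.0, 1: 90.0})   # base and arm articulations
print(poll_visible_markers(link, [0, 1]))      # {0: 45.0, 1: 90.0}
```

Only the markers actually located in the frame trigger a request, which matches step (4): a two-articulation arm uses two markers, but the loop generalizes to more.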
3.1. Main Blocks of the Desktop App

The desktop application is designed to send commands to the control system of the robotic arm over a serial USB connection and to get feedback on the angle of each articulation. This application was developed using Matlab/Simulink®, and the position of each articulation is independently controlled using a proportional-integral-derivative (PID) controller. The desktop application also allows the user to run simulations and generates comparison plots between the real and desired trajectories, taking as input the times and angles for each articulation.
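The per-joint position control is implemented in Matlab/Simulink; purely as an illustration, a discrete PID update of the kind applied to each articulation can be sketched in Python (the gains and the first-order plant are arbitrary placeholders, not the tuned values of the platform):

```python
class PID:
    """Textbook discrete PID controller; one instance per articulation."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a toy first-order joint model toward a 90-degree reference.
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
angle = 0.0
for _ in range(2000):
    u = pid.update(90.0, angle)
    angle += u * 0.01   # crude plant: angle rate proportional to control input
print(round(angle, 1))  # settles near the 90-degree reference
```

The integral term drives the steady-state error to zero, which is why the real platform can make each joint reach the required angle of the reference profile.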

3.2. Main Blocks of the Control Module

In Figure 2, a flow diagram including the software routines of the control module is shown. These software modules are hosted on an Arduino board with an HC-05 Bluetooth shield. Each module is described below:

(i) Configuration routine. This module establishes the configuration parameters of both serial connections (Bluetooth and USB), the transfer speeds, and the pin modes required for external communication.
(ii) Connection request manager. This module detects whether there is at least one connection request from the mobile application or the desktop application.
(iii) Get base angle routine. This module translates the reading obtained by the encoder located in the base articulation into digital values in order to compute the angle of the base. This angle is sent to the mobile application over the established Bluetooth connection.
(iv) Get arm angle routine. This module translates the reading obtained by the encoder located in the arm articulation into digital values in order to compute the angle of the arm. This angle is sent to the mobile application over the established Bluetooth connection.

Figure 2: Main blocks of the robotic arm control module.
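The two "get angle" routines reduce to the same count-to-degree conversion. The control module itself runs as Arduino C code; the sketch below only illustrates the arithmetic, with the counts-per-revolution constant and the serial message framing being illustrative assumptions rather than the platform's actual values:

```python
COUNTS_PER_REV = 1200  # placeholder: depends on the optical encoder used

def counts_to_degrees(counts):
    """Translate a raw optical encoder count into a joint angle in degrees."""
    return (counts % COUNTS_PER_REV) * 360.0 / COUNTS_PER_REV

def angle_message(articulation, counts):
    """Frame a joint angle as it might be sent over the Bluetooth serial link."""
    return f"{articulation}:{counts_to_degrees(counts):.1f}"

print(counts_to_degrees(300))      # quarter turn -> 90.0
print(angle_message("base", 600))  # half turn -> "base:180.0"
```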
3.3. Main Blocks of the Mobile App

The main blocks of the mobile application are shown in Figure 3. The details of each block are described below:

(i) Wireless connection module (WCM). This module allows users to detect whether a compatible robot arm Bluetooth connection is available. If this is the first connection, the module asks the user to link the device with the platform. Once the platform has been linked, the module is ready to get data from the platform to the application, activating the rest of the application functionalities.
(ii) Image acquisition module (IAM). This module serves as a bridge between the physical sensor and the application. The IAM accesses the camera buffer to obtain the image captured by the camera sensor.
(iii) Marker localization module (MLM). This module obtains the input video from the IAM and performs marker localization using the data previously obtained in the training phase for the specific markers located on the platform. Once a marker has been located, the obtained coordinates are stored and sent to the ARIM.
(iv) Articulation degree module (ADM). This module performs the articulation angle requests corresponding to the markers located by the MLM. The WCM returns the angles for each located marker and passes them to the ARIM.
(v) Augmented reality integrator module (ARIM). This module takes as input the dimensional localization of each marker obtained by the MLM and the angle of each marker obtained by the ADM, and generates the final image, where a protractor with the obtained angles is overlaid on the image obtained by the IAM. Finally, this image is shown to the user on the device screen.

Figure 3: Main blocks of the mobile application.
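The per-frame flow through these modules can be summarized as a simple pipeline. The actual modules are Java components on Android; the function names below are illustrative Python stand-ins for the MLM (ARToolkit detection), the WCM/ADM (Bluetooth angle request), and the ARIM rendering step:

```python
def arim_pipeline(frame, detect_markers, request_angle, overlay_protractor):
    """Sketch of one camera frame flowing through IAM -> MLM -> ADM -> ARIM."""
    markers = detect_markers(frame)                  # MLM: marker id -> (x, y)
    angles = {m: request_angle(m) for m in markers}  # ADM via the WCM link
    for marker_id, position in markers.items():      # ARIM: overlay protractors
        frame = overlay_protractor(frame, position, angles[marker_id])
    return frame

# Toy stand-ins: the "frame" is just a list of overlay records.
out = arim_pipeline(
    frame=[],
    detect_markers=lambda f: {0: (10, 20), 1: (30, 40)},
    request_angle=lambda m: {0: 45.0, 1: 90.0}[m],
    overlay_protractor=lambda f, pos, ang: f + [(pos, ang)],
)
print(out)  # [((10, 20), 45.0), ((30, 40), 90.0)]
```

The design choice worth noting is that angles are fetched only for markers the MLM actually located, so occluded articulations simply get no protractor in that frame.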

4. Experimental Setups and Results

4.1. Hardware Tools

(i) Two 12 V DC motors with optical encoders.
(ii) An L298N dual H-bridge motor driver.
(iii) A 12 V, 10 A power supply.
(iv) An HC-05 Bluetooth module for communicating the Arduino with the AR application via Bluetooth.
(v) An Arduino UNO board for hosting the communication modules.
(vi) A homemade robotic arm designed using SolidWorks® and built using materials from local providers. The SolidWorks design is shown in Figure 4(a), and the final design is shown in Figure 4(b).
(vii) A mobile device with a back camera and a Bluetooth adapter. The developed application was deployed on this device, and several tests were performed in order to validate marker localization and image visualization. The proposed application has been validated on devices with several Android versions. In Table 1, the devices and specifications used for testing the proposed application are listed. The application worked without problems on all listed devices.

Figure 4: Design and realization of the robotic arm: (a) robotic arm design with SolidWorks; (b) robotic arm prototype.
Table 1: Devices utilized for testing the proposed application.

In Figure 5, the connections required among the Arduino board, the motor driver, and the power source are shown. The Arduino board hosts the control module and sends commands to the dual motor driver, to which each motor is connected. The Arduino board receives commands for the robot arm from the desktop application through the USB connection, and the angles of each arm articulation are sent to the mobile application over the Bluetooth connection.

Figure 5: Integration of the components of the control module.
4.2. Software Tools

The desktop application was developed using Matlab and Simulink, on a desktop computer with Windows OS.

The mobile application was developed using Android Studio IDE with Android Studio SDK and Java SE Development Kit. The modules of the application were written in Java. In addition, ARToolkit was used for the marker recognition and integration of virtual and real objects in live capture obtained by the camera sensor of the device [15].

4.3. Designed Application

In Figure 6, the base (Figure 6(a)) and arm (Figure 6(b)) markers are shown. These markers were used for training the ARToolkit classification model for their recognition.

Figure 6: Markers for the robot arm articulations and localization of each marker by the AR demo: (a) base articulation marker; (b) arm articulation marker.

In Figure 7, the desktop application and the mobile application are shown.

Figure 7: Integration of the components of the proposed system: (a) robotic arm and desktop application for monitoring; (b) mobile application and its interaction with the robotic arm.

In Figure 8, two frames showing the obtained angles of each articulation and their visualization in the app are shown. From the obtained images, it is possible to conclude that the recognition of each marker is performed successfully and that the protractor with the obtained angles is displayed correctly at each of the tested angles.

Figure 8: Robotic arm AR interface: (a) initial position; (b) real-time angle measurements.
4.4. Time and Angle Results

In order to verify the alignment precision of the two axes (base and shoulder), measurements were carried out using Matlab. Figure 9(a) shows results from the base angle trajectories, where the position error ranges from 0.75° to 3.8°. In the case of the shoulder axis (Figure 9(b)), variations range from 0.08° to 7.28°. These variations could be reduced by using additional control strategies. With respect to the time lag between sending the instructions and the display of the response, a delay of 1 to 2 seconds was observed in the experiments.

Figure 9: Real and desired paths for robotic arm axes: (a) base axis; (b) shoulder axis.
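The reported position errors are simply pointwise differences between the desired and measured trajectories. With the logged angle samples (replaced here by made-up values, not the actual Matlab logs), the error range for an axis can be computed as:

```python
def position_errors(desired, measured):
    """Absolute pointwise error between reference and encoder trajectories."""
    return [abs(d - m) for d, m in zip(desired, measured)]

# Made-up sample trajectory for illustration only.
desired  = [0.0, 30.0, 60.0, 90.0]
measured = [0.75, 28.0, 63.8, 89.0]
errs = position_errors(desired, measured)
print(round(min(errs), 2), round(max(errs), 2))  # 0.75 3.8
```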

5. Conclusions and Future Work

In this paper, an approach for teaching robotic arm manipulator concepts was proposed. The system includes a homemade robotic arm, a control system, and the RAR@pp. The RAR@pp is focused on learning robotic arm manipulation algorithms by detecting markers on the robotic arm and visualizing in real time, through augmented reality, the angles of each articulation of the homemade robotic arm. These angles are obtained from the encoders of each motor and transmitted to the RAR@pp over Bluetooth. The application allows students to send configuration parameters to the robotic arm and visualize in real time the response of the arm to the commands using the RAR@pp. The application was tested on a large number of mobile devices, including smartphones and tablets.

The proposed platform captures the students’ attention, facilitating the understanding of complex robotics and kinematics concepts. The implementation of a smartphone-based application will allow a large number of students to access this type of educational resource, improving their performance and their understanding of key concepts in robotics and kinematics.

In this work, basic augmented reality techniques have been applied to the visualization of the arm articulation angles. Future work could be oriented toward replacing the markers with direct recognition of the shape of the object for AR applications. Direct object recognition capabilities could be used for working with different types of arms and with more degrees of freedom. Moreover, several sensors could be added to the robotic platform in order to expand its capabilities. For example, current sensors could be used to visualize the effect of torque on the axis motors, and pressure sensors in an end effector or inertial sensors could be implemented to study the device behavior. The RAR@pp could also be extended to display information about motor temperature and to measure the execution time of a specific task.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest.


Acknowledgments

The present project was partially funded by the National Council of Science and Technology of Mexico through a scholarship granted to Karla E. Bautista Hernández.


References

  1. G. Akçayır, H. M. Pektaş, and M. A. Ocak, “Augmented reality in science laboratories,” Computers in Human Behavior, vol. 57, pp. 334–342, 2016.
  2. M. Flanders and R. C. Kavanagh, “Build-a-robot: using virtual reality to visualize the Denavit-Hartenberg parameters,” Computer Applications in Engineering Education, vol. 23, no. 6, pp. 846–853, 2015.
  3. M. Stilman, P. Michel, J. Chestnutt, K. Nishiwaki, S. Kagami, and J. Kuffner, “Augmented reality for robot development and experimentation,” Technical Report, Robotics Institute, Carnegie Mellon University, Pittsburgh, PA, USA, 2005.
  4. X. Wei, D. Weng, Y. Liu, and Y. Wang, “Teaching based on augmented reality for a technical creative design course,” Computers and Education, vol. 81, pp. 221–234, 2015.
  5. W. Tarng, Y.-S. Lin, C.-P. Lin, and K.-L. Ou, “Development of a lunar-phase observation system based on augmented reality and mobile learning technologies,” Mobile Information Systems, vol. 2016, Article ID 8352791, 12 pages, 2016.
  6. F. Quercioli, “Augmented reality in laser laboratories,” Optics & Laser Technology, vol. 101, pp. 25–29, 2018.
  7. F. Clemente, S. Dosen, L. Lonini, M. Markovic, D. Farina, and C. Cipriani, “Humans can integrate augmented reality feedback in their sensorimotor control of a robotic hand,” IEEE Transactions on Human-Machine Systems, vol. 47, no. 4, pp. 583–589, 2015.
  8. L. Yang, C.-K. Chui, and S. Chang, “Design and development of an augmented reality robotic system for large tumor ablation,” International Journal of Virtual Reality, vol. 8, no. 1, pp. 27–35, 2015.
  9. C. A. Jara, F. A. Candelas-Herías, M. Fernández, and F. Torres, “An augmented reality interface for training robotics through the web,” in Proceedings of the 40th International Symposium on Robotics, Barcelona, Spain, 2009.
  10. M. B. Ibáñez, Á. D. Serio, D. Villarán, and C. D. Kloos, “Experimenting with electromagnetism using augmented reality: impact on flow student experience and educational effectiveness,” Computers & Education, vol. 71, pp. 1–13, 2014.
  11. J. Martín-Gutiérrez, P. Fabiani, W. Benesova, M. D. Meneses, and C. E. Mora, “Augmented reality to promote collaborative and autonomous learning in higher education,” Computers in Human Behavior, vol. 51, pp. 752–761, 2015.
  12. D. Ni, A. W. W. Yew, S. K. Ong, and A. Y. C. Nee, “Haptic and visual augmented reality interface for programming welding robots,” Advances in Manufacturing, vol. 5, no. 3, pp. 191–198, 2017.
  13. T. H. J. Collett and B. A. MacDonald, “An augmented reality debugging system for mobile robot software engineers,” Journal of Software Engineering for Robotics, vol. 1, no. 1, pp. 18–32, 2009.
  14. K. Ruan and H. Jeong, An Augmented Reality System Using QR Code as Marker in Android Smartphone, IEEE, Piscataway, NJ, USA, 2012.
  15. H. Kato, “ARToolKit,” 2018.
  16. T.-W. Kan, C.-H. Teng, and W.-S. Chou, “Applying QR code in augmented reality applications,” in Proceedings of the 8th International Conference on Virtual Reality Continuum and Its Applications in Industry (VRCAI’09), pp. 253–257, Yokohama, Japan, 2009.
  17. J.-C. Chien, H.-Y. Lu, Y.-S. Wu, and L.-C. Liu, “ARToolKit-based augmented reality system with integrated 1-D barcode: combining colorful markers with remote servers of 3D data for product promotion purposes,” in Proceedings of the Second International Conference on Computational Collective Intelligence: Technologies and Applications (ICCCI’10), vol. 3, pp. 200–209, Kaohsiung, Taiwan, 2010.
  18. D. Yan and H. Hu, “Application of augmented reality and robotic technology in broadcasting: a survey,” Robotics, vol. 6, no. 3, p. 18, 2017.