Abstract

Current trends in robotics aim to close the gap that separates technology and humans, introducing novel robotic devices to improve human performance. Although robotic exoskeletons represent a breakthrough in mobility enhancement, there are design challenges related to the forces exerted on the users' joints, which can result in severe injuries. This occurs because most current developments treat the joints as fixed rotational axes. This paper proposes the use of commercial vision systems to perform biomimetic joint design for robotic exoskeletons, introducing a kinematic model based on irregularly shaped cams as the joint mechanism that emulates the bone-to-bone joints in the human body. The paper follows a geometric approach for determining the location of the instantaneous center of rotation in order to design the cam contours. Furthermore, a commercial vision system is proposed as the main measurement tool because it is noninvasive and allows the subjects under measurement to move freely. The application of this method yielded relevant information about the displacements of the instantaneous center of rotation at the human knee joint.

1. Introduction

An exoskeleton is commonly defined in the literature as an external structural mechanism with joints and links corresponding to those of the human body [1]. Exoskeletons are usually classified into two main groups: (i) passive, where the driving forces and torques are external to the device, and (ii) active, where the device's actuators exert the forces that drive the device and the user. Exoskeletons have applications in physiotherapy and as haptic or assistive devices [1].

Several exoskeletons have been developed in recent years; for example, Aguirre-Ollinger et al. [2] introduced a stationary one degree of freedom (DoF) lower limb exoskeleton for assisting knee flexion and extension exercises. Kinnaird and Ferris [3] proposed a medial gastrocnemius robotic ankle exoskeleton actuated with a pneumatic cylinder. Lopez et al. [4] developed a lower limb robotic exoskeleton with two DoF, actuating the knee and ankle joints. The Vanderbilt powered orthosis [5], an active lower limb exoskeleton with two DoF per leg, has been used as a platform for other research projects, including controlled assisted locomotion [6] and stair climbing for paraplegic users [7]. Bortole et al. [8] developed a robotic exoskeleton for poststroke patients, used for overground gait rehabilitation during the critical months after the stroke. Van Ham et al. [9] developed a mechanically adjustable compliance and controllable equilibrium position actuator (MACCEPA) to actively drive the knee joint while performing specific rehabilitation movements.

In addition to the abovementioned examples, there are several commercial robotic exoskeleton devices on the market. For example, the Swiss company Hocoma [10] developed the Lokomat, a gait training robotic exoskeleton that is extensively used for treadmill rehabilitation of patients with diverse lower limb disabling conditions. Honda's Walking Assist device [11] is a portable exoskeleton consisting of a waist frame connected to two thigh frames, each of which provides one DoF using a brushless DC motor. The Japanese company Cyberdyne is known worldwide for the development of the HAL (hybrid assistive limb) exoskeleton [12], which is used for task support, rehabilitation, or welfare applications. The HAL robot has been extensively studied; for example, Tsukahara et al. [12] proposed a synchronization control based on the position change of the center of ground reaction force (CoGRF) and later added a speed estimator to this control model [13], while Hassan et al. [14] presented a HAL control approach that combines both lower and upper limb pose signals. Furthermore, the Indego exoskeleton [15], based on the Vanderbilt exoskeleton [5], is another good example of a commercial device: a powered lower limb orthosis for overground gait training. The Ekso from Ekso Bionics [16] was developed by the Berkeley Robotics & Human Engineering Laboratory [17] as an evolution of the exoskeleton robots presented by Kazerooni [18]. Ekso is presented as a wearable bionic suit that enables subjects with lower limb disabilities to stand and walk overground. This device is intended for medically supervised use, providing support to patients with lower limb paralysis and retraining them in gait exercises. ReWalk Robotics has produced the first FDA-approved exoskeleton, which, similar to other devices (e.g., Indego and Ekso), includes four actuated DoF (hip and knee joints) and is programmed with normalized gait patterns that provide proper muscular training to its users [19]. Finally, the REX from the Rex Bionics Group [20] is currently the only device that provides full body weight support in equilibrium without the use of canes.

Most of the current advances in robotic exoskeletons have been achieved by considering the joints as fixed rotational axes. This design practice, however, can lead to injury to the user, since human joints are complex systems composed of bones driven by muscles and tendons, presenting motions whose rotational axes do not remain in the same location. To address this issue, a common practice among designers is to use biomimetics [21] (i.e., the search for inspiration in nature in order to find the solution to a given problem) for the development of exoskeletons. For instance, Mizoguchi et al. [22] designed and developed a musculoskeletal humanoid robot based on the actual musculoskeletal arrangement of the human hips. Zhu et al. [23] developed a lower extremity exoskeleton design with 15 DoF, selecting each degree of freedom by analyzing the human lower limb joints as spherical joints; although they acknowledged that this implementation does not ideally emulate human joints, more faithful alternatives were discarded because of their implementation difficulty. Miranda et al. [24] designed and implemented a robotic upper limb (elbow) exoskeleton with bioinspired actuators. Furthermore, Zhu et al. [25] proposed a biomimetic knee exoskeleton, focusing on the energetic aspects of a segmented foot design in which a compliant joint is embedded in order to improve human walking with assistive exoskeletons.

It is important to note that the majority of novel designs of lower limb robotic exoskeletons have focused on the knee joint, which is usually the actuated joint. Other designs focus on the actuation of the hip and ankle, but in almost all cases those joints are mechanically designed as simple rotational axes. This paper conducts a detailed study of the kinematics of the knee joint, proposing the use of commercial vision systems to perform biomimetic joint design for robotic exoskeletons. The main contribution of this work is an equivalent kinematic model based on irregularly shaped cams that emulate the bone-to-bone joints in the human body, whose contours are obtained by identifying the instantaneous center of rotation (ICR). Furthermore, the use of a commercial vision system is proposed as the main measurement tool, resulting in relevant information about ICR displacements at the human knee joint. This work can be applied as an approach for the ergonomic design of exoskeleton devices. The remainder of the paper is organized as follows: Section 2 presents the kinematic model based on the geometrical determination of the ICR of two bodies in contact. Section 3 presents the experimental setup prepared in order to use a commercial vision system for tracking the leg motion, including the prototype development and data processing. Section 4 presents a detailed explanation of the algorithms used to determine the ICR of the knee joint from the vision system data. Section 5 presents the experimental results of this work. Finally, Section 6 presents the conclusions and suggestions for future work.

2. Geometrical Model of the Knee Joint Based on ICR

According to the kinematics of rigid bodies in planar motion, the velocity of every point of a rigid body can be expressed as if the body were instantaneously spinning about an axis perpendicular to the plane of motion [26]. This rotational axis intersects the body's plane at its ICR.

Figure 1 illustrates the ICR or IC (instantaneous center) for the planar motion of a rigid body. Note that the velocity vectors of two specific points of the body (e.g., the endpoints of a bar) are instantaneously equivalent to the velocities of points spinning around a fixed point (the IC): the line joining each point of the body to the IC is perpendicular to that point's velocity vector.

As discussed above, most robotic exoskeletons are mechanically modeled with single rotational joints. However, the majority of human joints have more complex kinematics that involve rotations and even translations over different planes. The knee joint is known to have 1 DoF, but because its rotation is driven by muscles and tendons over a bone-to-bone contact, complex rotations and translations occur, unlike the single fixed-axis planar rotation commonly applied to exoskeletons.

Figure 2(a) illustrates a sagittal plane view of the human knee joint. This paper focuses on the fact that the tibia and femur maintain a close-to-tangential contact (through their tissue interface) where the knee rotation is produced. Considering the femur as the static reference body, the tibia rotates about the contour of the femur's distal end while the contact point (denoted as IC) changes its position. This particular motion in the sagittal plane can be represented as a cam system in which one cam remains static and the other rotates around the static cam contour. Figure 2(b) illustrates the proposed equivalent kinematic model of the knee joint, represented as two rotational joints connected by a prismatic joint in a serial open chain.

This irregularly shaped characteristic emulates the behavior of a cam system, in which the distance between the cam axis and the contact point varies with the cam's rotation angle. The ICR lies on the line that joins the reference points of the femur and the tibia, while the two joint angles of the equivalent chain are measured between the grounded link and the second link and between the second and third links, respectively. In this model, the variable distance between the cam axes is determined by the static cam contact radius at a given rotation angle and by the planetary cam contact radius; the radius ratio between these two radii can then be expressed as a function of the rotation angle.
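To make the equivalent chain concrete, the following minimal sketch computes the planar forward kinematics of the proposed rotational-prismatic-rotational serial chain of Figure 2(b); the symbol names (theta1, d, theta2) and the NumPy implementation are illustrative assumptions, since the paper does not fix a notation.

import numpy as np

def rpr_chain_points(theta1, d, theta2, tibia_len, base=np.zeros(2)):
    """Planar forward kinematics of the R-P-R chain used as the equivalent
    knee model (Figure 2(b)).  theta1: angle of the grounded revolute joint,
    d: prismatic extension (variable distance between the cam axes),
    theta2: angle of the distal revolute joint, tibia_len: length of the
    tibia link.  All names are illustrative, not the paper's notation."""
    # End of the prismatic link (location of the distal revolute joint).
    p1 = base + d * np.array([np.cos(theta1), np.sin(theta1)])
    # Endpoint of the tibia link.
    p2 = p1 + tibia_len * np.array([np.cos(theta1 + theta2),
                                    np.sin(theta1 + theta2)])
    return p1, p2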

In practice, finding the ICR requires information about the velocity of the rigid body; the velocity, in turn, is obtained from the change in position between two consecutive instants. Figure 3 illustrates the determination of the ICR for the system shown in Figure 2, where the marked points represent positions of points of the rigid body at two consecutive instants, and the geometric construction locates the rigid body's IC between those instants.

In order to complete this model, it is necessary to have specific values of the contact radius and the rotation angle over the whole range of motion. This is possible by selecting a fixed point over the cam at the femur's ending and assigning a rotation angle to every point of the irregular contour (see Figure 2(a)). Hence, the contact radius is given by the distance between the fixed point and the IC, while the rotation angle is given by the corresponding knee angle.

The next section defines the experimental setup that will be used in this paper in order to obtain the abovementioned data for the calculation of the ICR of the joint.

3. Experimental Setup Using the VICON Vision System

This section presents the experimental procedure followed in order to develop, apply, and adjust a biomimetic design method for obtaining a mechanical model of the knee joint for robotic exoskeletons.

This work uses the VICON vision system, a motion capture system based on marker tracking. The system consists of a set of cameras that sense infrared light, emitted by the cameras and reflected by retroreflective spherical markers positioned on the tracked moving body. The vision system uses homography-based self-localization of its cameras to create a static workspace in which a marker can be located. Based on triangulation, the VICON system is able to locate a marker in every frame in which it is observed by at least two cameras. Through the NEXUS software, VICON users can create complex models of marker configurations, establish relationships among them, and label markers in order to obtain position, velocity, and acceleration data of each individual marker over time.

Figure 4 illustrates the experimental setup used in this paper. First, motion data are obtained from a subject with the objective of fitting the previously proposed kinematic model. This model, however, requires both a static reference body and a moving body. In order to fulfill this requirement, the femoral area (thigh) is selected as the relatively static reference body by attaching a marker fixture that will later be transformed into a global reference frame. The tibial area (calf) is selected as the moving body, where a second marker fixture is attached. Position records of every marker mounted on the subject are obtained and exported with the VICON system and later processed with external software (LabVIEW).

Since the goal of this work is to obtain data suitable for the construction of exoskeleton joints, a reverse engineering method is used in which the rigid interfaces act as exoskeleton links (one for the thigh and one for the calf) and the free space over the knee acts as the black box to be modeled. Hence, marker fixtures are used as rigid interfaces with two main goals: (i) to act as solid mechanical links of an exoskeleton and (ii) to place the markers in a useful arrangement that eases the subsequent data processing. The thigh marker plate is designed to create a global reference frame with its markers positioned on a common plane. The calf marker plate is designed to be parallel to the thigh plane and to provide the mobile markers that define the ICR.

Figures 5(a) and 5(b) show the computer-assisted design (CAD) models and the 3D-printed parts of the thigh and calf marker plates, respectively. For the thigh marker plate, at least three markers are needed to create a full reference frame; however, two redundant marker positions are included in order to select the best set of three at any given frame. For the lower limb (calf) marker plate, at least two markers are required, since the ICR geometric approach is based on the bisector lines constructed for every mobile marker and their intersections; nevertheless, in order to obtain comparative data, at least three markers should be used. A total of seven marker positions are included so that the best set of markers can be selected for the determination of the ICR. A group of consecutive calf markers is specially selected because of the common space between them and their arc pattern, one of which also serves as the calf angle marker; the remaining markers are placed at pseudorandom positions in order to test their usefulness.

For this experiment, the VICON recorded data are processed with a low-pass digital filter with a cutoff frequency of 2 Hz, considering that the maximum leg-swing frequency reached by the subject during recording is 1 Hz.
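As an illustration, a zero-phase Butterworth implementation of this low-pass filter is sketched below; the 2 Hz cutoff comes from the text, whereas the sampling rate and filter order are assumptions.

import numpy as np
from scipy.signal import butter, filtfilt

FS = 100.0      # assumed VICON sampling rate [Hz]
CUTOFF = 2.0    # cutoff frequency from the text [Hz]

def lowpass(trajectory, fs=FS, cutoff=CUTOFF, order=4):
    """Zero-phase low-pass filtering of one marker trajectory.
    trajectory: (n_frames, 3) array of marker positions [mm]."""
    b, a = butter(order, cutoff / (fs / 2.0))   # normalized cutoff
    return filtfilt(b, a, trajectory, axis=0)   # filter along the time axis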

The next section presents the algorithms that have been used in order to determine the axis motion of the knee joint using the VICON vision system.

4. Knee Joint Axis Motion Tracking

The inverse modeling of the knee joint includes the collection of the vision system data, followed by the geometrical determination of the ICR considering the system proposed in Section 2. For this purpose, the final ICR detection algorithm is divided into the following principal computing processes:

(1) Point-Slope Bisectors Determination. This process obtains an array of line elements (point and slope) to be applied as bisectors for every marker and motion frame.

(2) Bisectors Set Visualization. Using the array created in step (1), lines are projected from their origin (middle point) to visualize the intersections between them.

(3) ICR Detection and Tracking. Intersections are obtained by matching the bisector line equations.

The abovementioned computing processes are fully detailed in the following subsections.

4.1. Point-Slope Bisectors Determination

The application of the ICR geometric approach requires every analyzed point to lie on the same common plane. Thus, the first step is to project the position of every point onto a single plane. Since the data are normalized, all the thigh fixture points already lie on the x-y plane, and the calf fixture points lie on a common plane that is nearly parallel to the thigh plane. Every point is therefore projected onto the x-y plane by eliminating its z-axis position component from the matrix containing the set of point positions at each frame. Note that those point positions are taken from the position vectors recorded at every frame for each specific marker of the thigh and calf fixtures. For the normalized data, and considering the z component to be close to zero, the projection onto the x-y plane simply discards that component.

The bisector line elements (point and slope) are acquired for every marker position and lie on a common plane. Let P denote the total 3D array of data obtained from the data arrangement process, let k denote the frame number, taking values from 1 to N (the number of frames of a specific recorded session), and let p_i[k] = (x_i[k], y_i[k]) be the position of marker i on the x-y plane at motion frame k. The new data set of bisectors is then obtained for every marker, where B_i[k] represents the set of bisector line elements of marker i at frame k. Each bisector is obtained from a pair of consecutive motion frames, k and k+1, in order to apply the geometric approach to the ICR position. Thus the bisector slope created from points p_i[k] and p_i[k+1] is the negative reciprocal of the slope of the displacement between them, m_i[k] = -(x_i[k+1] - x_i[k]) / (y_i[k+1] - y_i[k]), and the bisector reference point is the midpoint of that displacement, q_i[k] = (p_i[k] + p_i[k+1]) / 2, with components along the x- and y-axes.

Since every set of bisector elements requires the marker positions at two consecutive frames, k and k+1, the complete 3D data array of bisectors for every marker and every motion frame contains one frame fewer than the original recording.
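A minimal sketch of this bisector construction is given below, assuming the normalized marker trajectories are available as a NumPy array; the array layout and function names are illustrative.

import numpy as np

def bisectors(markers_xyz):
    """markers_xyz: (n_frames, n_markers, 3) array of normalized marker
    positions.  Returns the midpoints and slopes of the perpendicular
    bisectors built from every pair of consecutive frames (Section 4.1).
    The layout is an assumption made for illustration."""
    # Project onto the x-y plane by discarding the z component.
    xy = markers_xyz[:, :, :2]
    # Displacement between consecutive frames and its midpoint.
    delta = xy[1:] - xy[:-1]                 # shape (n_frames-1, n_markers, 2)
    midpoints = 0.5 * (xy[1:] + xy[:-1])
    # Bisector slope: negative reciprocal of the displacement slope.
    with np.errstate(divide="ignore", invalid="ignore"):
        slopes = -delta[..., 0] / delta[..., 1]
    return midpoints, slopes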

In order to have angular position data of the calf relative to the thigh at every frame, an additional line element (the calf or lower leg parallel line, LL) is appended to the bisectors data set for every motion frame. The reference point of this line equals the position of the corresponding calf marker at the current frame, and its slope is set to be parallel to the tibia by design. A line that is parallel to the subject's tibia and crosses that marker position forms constant angles with any line created between this marker and any other marker of the calf fixture (see Figure 6).

Adding these last elements to the resulting 3D array completes the bisectors data set used in the following subsections.

4.2. Bisectors Set Visualization

Once the bisector line elements are obtained, it is useful to visualize the most important segments of these lines in order to observe their behavior and the possible instantaneous location of the knee joint axis.

This preliminary visualization can reveal recording errors that were too small to be noticed in the VICON data visualization.

The position changes used to obtain the movement directions (slopes) are expected to be as small as possible (in order to keep the ICR detection error close to zero); hence, even minimal position errors can appear as large direction errors when perpendicular lines are obtained and projected from those points toward the expected ICR position. Visualizing the bisectors beforehand helps to identify such recording errors and avoid misleading results.

In order to obtain significant line segments, the bisectors are projected from their reference points toward the region where the ICR is expected to be found. This projection is done over a useful range of distances while avoiding chart saturation.

Line plotting is done by obtaining two points in the chart space. The first point is the bisector reference point (midpoint) of every marker, while the second point is found using the point-slope line equation.

A minimum point-to-point distance of 400 mm is used, applied along one axis according to the slope value; for a set of bisector line elements taken from a full motion frame set, the second point is then obtained from the point-slope equation.

The x coordinate of the second point is chosen so that the two points delimit a 400 mm region that encloses the expected ICR; depending on the projection direction, this coordinate is higher or lower than the reference point's x coordinate by 400 mm. If the resulting difference along the y-axis exceeds 400 mm, the second point is instead determined along the y-axis in a similar manner, again enclosing the expected ICR. This algorithm is repeated for every set of bisector line elements of the array and at every motion frame. Figure 7 shows the resulting plot of bisector lines for one frame. The LL line is also plotted in order to visualize the leg angle relative to the thigh (horizontal reference line). It is important to highlight that, due to the not fully controlled position of the T2 marker and the expected ICR displacement over the motion process, the LL line is not necessarily expected to intersect the other lines in the same region.
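A sketch of this segment construction for plotting is shown below; the 400 mm span comes from the text, while the axis-selection rule and the names are illustrative assumptions.

SPAN = 400.0  # mm, span of the region expected to enclose the ICR (from the text)

def segment_endpoints(midpoint, slope, toward_icr=+1):
    """Return the two endpoints of a bisector segment for plotting.
    midpoint: (x0, y0) bisector reference point; slope: bisector slope;
    toward_icr: +1 or -1, direction in which the ICR is expected."""
    x0, y0 = midpoint
    if abs(slope) <= 1.0:
        # Shallow line: step 400 mm along the x-axis.
        x1 = x0 + toward_icr * SPAN
        y1 = y0 + slope * (x1 - x0)
    else:
        # Steep line: step 400 mm along the y-axis to avoid chart saturation.
        y1 = y0 + toward_icr * SPAN
        x1 = x0 + (y1 - y0) / slope
    return (x0, y0), (x1, y1)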

4.3. ICR Detection and Tracking

The final step for the proper location of the ICR is presented in this section. At this point, four bisector lines remain after the marker discrimination step, which yields a total of six possible pairwise intersections. Considering low-amplitude noise, the final ICR is given by averaging those six intersection positions.

In order to apply an intersection algorithm, the line equations are set equal to each other to find the first axis value, which is then substituted back into either equation to obtain the second axis value. Each line equation is constructed from its corresponding bisector line elements. Similar to the point-slope form, the general equation of a bisector line is y = m_i x + b_i, where b_i is the value of y when x = 0 (the y-intercept) and is obtained from the reference point and slope as b_i = q_{y,i} - m_i q_{x,i}, with m_i, q_{x,i}, and q_{y,i} being the elements of the set of bisector line elements of marker i within the full set of bisector elements corresponding to motion frame k. The intersection between two lines, where x and y are equal for both, is then given by m_i x + b_i = m_j x + b_j, where i and j are two different markers from the same fixture at the same motion frame. Solving for x, the first axis value is x = (b_j - b_i) / (m_i - m_j), and finally, substituting x into either of the two line equations gives the second axis value, y = m_i x + b_i.

ICR position values, as mentioned before, must be obtained for every intersection (six intersections in this case of four bisector lines) and then averaged into a single position vector for every frame, whose components are the averaged ICR position along the x- and y-axes at the same frame k.
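A sketch of the pairwise intersection and averaging step, reusing the midpoints and slopes from the earlier bisector sketch (names are illustrative):

import numpy as np
from itertools import combinations

def icr_from_bisectors(midpoints, slopes):
    """midpoints: (n_lines, 2) bisector reference points; slopes: (n_lines,)
    bisector slopes for one frame.  Intersects every pair of lines and
    averages the intersections into a single ICR estimate."""
    # y-intercept of each line: b = y0 - m * x0
    intercepts = midpoints[:, 1] - slopes * midpoints[:, 0]
    points = []
    for i, j in combinations(range(len(slopes)), 2):
        if np.isclose(slopes[i], slopes[j]):
            continue  # parallel lines have no finite intersection
        x = (intercepts[j] - intercepts[i]) / (slopes[i] - slopes[j])
        y = slopes[i] * x + intercepts[i]
        points.append((x, y))
    return np.mean(points, axis=0)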

Also, in order to have functional design information, it is necessary to obtain the knee angle for every ICR location. From the bisectors data set of the same motion frame, the knee angle is obtained by using the LL line elements and projecting this line so that position differences along both the x- and y-axes are available on the same line. These values are obtained by projecting the line defined by the LL elements to its intersection with the x-axis, using the line's reference point and slope. Trigonometric functions are then applied to the x and y differences between these two points, so that the knee angle is obtained as the arctangent of their ratio.
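A sketch of this angle computation, assuming the LL line is given by a reference point and a slope in the thigh reference frame (names are illustrative):

import numpy as np

def knee_angle_deg(ll_point, ll_slope):
    """Angle of the lower leg parallel (LL) line relative to the horizontal
    thigh reference, in degrees.  ll_point: (x0, y0) point on the line;
    ll_slope: its slope."""
    x0, y0 = ll_point
    # Project the LL line to its intersection with the x-axis (y = 0).
    x_at_y0 = x0 - y0 / ll_slope
    dx, dy = x0 - x_at_y0, y0
    return np.degrees(np.arctan2(dy, dx))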

Once these values are obtained for every frame from 2 to N (the first frame is consumed by the point-slope detection), a new array of final ICR data is constructed, in which each entry contains the averaged ICR position and its paired knee angle for a specific frame; this final array is the one applied to the joint design.

5. Experimental Results

The experimental design developed for this application resulted in a reliable tool to model the behavior of the human leg as two moving rigid bodies. The use of exoskeleton-like marker fixtures also facilitated the data recording process because of their ease of mounting.

The VICON measurement tool, used to capture 3D motion data from the subject of interest, showed good performance under correct calibration and environment conditions. Performance indexes were obtained from the squared errors of the distances between markers, comparing the design values with the VICON readings for every recorded frame. In the first recording iteration, VICON showed significant noise, with squared error peaks of up to 40 mm², producing data that was almost useless for the geometric approach. However, minor changes to the VICON environment and recording parameters yielded major improvements, with the squared error values remaining consistently close to 5 mm² (see Figure 8).
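A sketch of this performance index, assuming the nominal inter-marker distances of the 3D-printed fixtures are known (the pairing scheme and names are illustrative):

import numpy as np

def squared_error_index(markers_xyz, design_distances):
    """markers_xyz: (n_frames, n_markers, 3) VICON marker positions [mm];
    design_distances: dict mapping marker index pairs (i, j) to their
    nominal distance on the fixture [mm].  Returns the mean squared
    distance error per frame [mm^2]."""
    errors = np.zeros(markers_xyz.shape[0])
    for (i, j), nominal in design_distances.items():
        measured = np.linalg.norm(markers_xyz[:, i] - markers_xyz[:, j], axis=1)
        errors += (measured - nominal) ** 2
    return errors / len(design_distances)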

Normalization proved to be highly important in the model fitting process, providing a reliable means of obtaining the relative movement of the calf with respect to the thigh. Figure 9 shows the resulting normalization of a recorded session at different frames of the process, which establishes the global reference frame of the thigh fixture and the subsequent relative movement of the calf.
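One possible way to perform this normalization is sketched below, building an orthonormal frame from three thigh markers and expressing the calf markers in it; the exact construction is not specified in the paper, so this is an assumption.

import numpy as np

def normalize_to_thigh_frame(thigh3, calf):
    """thigh3: (3, 3) positions of three thigh-fixture markers;
    calf: (n, 3) calf-marker positions, all in VICON coordinates [mm].
    Returns the calf markers expressed in a frame attached to the thigh
    fixture, so the thigh acts as the static reference body."""
    origin = thigh3[0]
    x_axis = thigh3[1] - origin
    x_axis /= np.linalg.norm(x_axis)
    normal = np.cross(x_axis, thigh3[2] - origin)   # z-axis: fixture plane normal
    normal /= np.linalg.norm(normal)
    y_axis = np.cross(normal, x_axis)
    rotation = np.stack([x_axis, y_axis, normal])   # rows are the frame axes
    return (calf - origin) @ rotation.T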

Also, this process showed that it is preferable to carry out the whole recording in the same quadrant of the VICON 3D workspace in order to simplify the normalization computations and the selection of the correct transformation angles.

The markers of the thigh fixture showed acceptable stationary-body behavior while the other markers moved around them as expected. After normalization, low-amplitude high-frequency noise was detected during the visualization process. The filtering process was implemented to eliminate the effect of this noise on the computation of the bisectors, and performance index time series were plotted for every recorded session (see Figure 10), resulting in a reduction of the squared error peaks of up to 50% compared with the unfiltered sessions.

Figure 10 shows the performance index time series that were obtained before and after applying the low pass filter (2 Hz cutoff frequency). The performance indexes were a measurement of the VICON recording efficiency for specific sessions. As long as the error does not increase considerably, the VICON recording is considered to be efficient and reliable for this work. Also, if new error peaks appear at the performance index time series of a filtered data session, then the filter would be considered to have a negative impact on the data recording.

As seen in Figure 10, a steady state error is present even before the filtering process is applied. This is expected given the use of 3D printing technology to manufacture the marker fixtures. However, as long as this steady state error remains constant, the marker positions given by the VICON system can be considered to behave as rigid bodies, which is the objective of the recording experiment.

Also, in the first set of experiments, several sampling frequencies were tried in order to establish a proper frequency range. The subject under recording was asked to perform typical motions without any periodic control, using the full VICON workspace volume available in the laboratory (close to 6 m³). The resulting performance indexes for this first iteration were not reliable given the high error peaks. This led to a second recording iteration in which the VICON cameras were relocated to create a smaller workspace (close to 4 m³) and the experimental parameters were controlled as shown in Table 1.

A clearly delimited intersection region was observed during the bisector visualization process for every recorded session, making this visualization an effective error localization tool. In addition, the bisectors showed erratic behavior close to the configurations of full knee extension and flexion, namely, when the angular velocity is close to zero and the displacement is about to change direction. Figure 11 shows an example of a frame in which the movement lies in this region.

Furthermore, note that, of the complete set of 12 markers used in the VICON recording process, only 7 were used in the complete processing. As previously discussed, three of the five thigh markers were selected to locate their common plane and reorient it to the global reference frame. Moreover, the marker selection process developed for the calf fixture showed that the best set of markers to be treated as particles of a rigid body was a group of four consecutive calf markers.

Figures 12 and 13 show the well-delimited trajectories obtained in different experimental sessions, presenting repetitive patterns that were already observed before the complete geometric approach for the determination of the ICR was applied.

Although the marker fixtures were not mounted at exactly the same location on the subject's leg in the different recorded sessions, the results indicate that the ICR trajectories were consistently located in the same region relative to the thigh frame. This confirmed that the observed area corresponded to the knee joint and that it presented displacements of similar magnitude in every session, confirming that the ICR of the knee joint undergoes significant displacements in this area.

The resulting contours for every session were plotted, showing the expected ICR displacement. Nevertheless, some sessions did not present well-delimited trajectories despite having similar angle values and movements in similar regions. These particular discrepancies are attributed to recording errors in those specific sessions.

Furthermore, Figures 12 and 13 show the plotted contours that resulted from the experimental work, where a total of 12 swing movements (from flexion to extension or vice versa) are displayed for a common range of knee angle values from 75° to 125° (the plotted leg swing motion).

After obtaining the contours with an embedded knee angle value for every point on the curve, and considering that the angle ranges are similar across sessions, it is possible to develop a mechanical design by placing a fixed reference point and then determining every parameter of the proposed kinematic model.
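A sketch of how the cam contour can be extracted from these results, pairing each averaged ICR with its knee angle and measuring the radius from a chosen fixed point on the femoral cam (the fixed point and names are illustrative assumptions):

import numpy as np

def cam_profile(icr_points, knee_angles_deg, fixed_point):
    """icr_points: (n, 2) averaged ICR positions in the thigh frame [mm];
    knee_angles_deg: (n,) paired knee angles; fixed_point: (2,) chosen
    reference point on the femoral cam.  Returns angle/radius pairs
    describing the cam contour, sorted by angle."""
    radii = np.linalg.norm(icr_points - np.asarray(fixed_point), axis=1)
    order = np.argsort(knee_angles_deg)
    return np.asarray(knee_angles_deg)[order], radii[order]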

The trajectory visualization is performed once the complete data set of ICR points for every frame and their paired knee angles has been collected in a single 2D array. Note, however, that a visualization of the ICR trajectories and displacements alone does not provide sufficient data for design purposes; the paired angle values are also required. Figure 14 shows the resulting contour on the x-y plane obtained using the equivalent model proposed in this paper.

6. Conclusions

This paper proposes the use of commercial vision systems to determine the geometrical design of the knee joint for robotic exoskeletons. Given that most of the devices found in the literature are designed by considering the human joints as single fixed-axis rotational joints, this paper proposes a kinematic model based on irregularly shaped cams as the joint mechanism for emulating the bone-to-bone joints in the human body. The paper follows a geometric approach for determining the ICR location in order to design those cam contours. The implementation of marker fixtures, in which rigid frames were mounted on the subject's thigh and calf, showed acceptable results. Reliable rigid body-like data were obtained, presenting an average squared error of 4.51 mm², which can be attributed to manufacturing tolerances, with a standard deviation of 1.7 mm². The vision system provided a reliable measurement tool for motion tracking, presenting considerable improvements when minor changes were made to the vision system environment (up to a 53.58% squared error reduction compared with the first iteration). The human knee joint consistently showed significant ICR displacements over an average area of 23.5 mm × 34.8 mm along the x- and y-axes, respectively. The geometric approach to the ICR position for every movement frame showed consistent results even when a total of six bisector intersections were computed. The resulting data support that the proposed kinematic model of contacting cams, with a serial link connection of rotational-prismatic-rotational joints, fits the real human knee behavior well. Future work will focus on applying this model to other single-DoF joints (e.g., the elbow), on investigating ICR displacements of the knee joint in more than one plane (only the sagittal plane was considered in this work), and on extending the determination of equivalent models to higher-DoF joints (e.g., the shoulder and hip). Furthermore, an exoskeleton prototype will be constructed using the model and techniques presented in this paper.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work has been supported by the Consejo Nacional de Ciencia y Tecnología (CONACYT) and the National Robotics Laboratory at Tecnológico de Monterrey.