BioMed Research International
Volume 2018, Article ID 7816160, 8 pages
https://doi.org/10.1155/2018/7816160
Research Article

The Dimensionless Squared Jerk: An Objective Parameter That Improves Assessment of Hand Motion Analysis during Simulated Shoulder Arthroscopy

1Department of Orthopedic Surgery, St. Carolus Hospital, Jakarta, Indonesia
2Department of Orthopedic Surgery, Asan Medical Center, College of Medicine, University of Ulsan, Seoul, Republic of Korea
3Upper Limb Department, Robert Jones & Agnes Hunt Hospital, Oswestry, England, UK
4Department of Robotics Engineering, Daegu Gyeongbuk Institute of Science and Technology, Daegu, Republic of Korea

Correspondence should be addressed to In-Ho Jeon; jeonchoi@gmail.com

Received 14 March 2018; Revised 4 June 2018; Accepted 19 June 2018; Published 11 July 2018

Academic Editor: Berardo Di Matteo

Copyright © 2018 Erica Kholinne et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Purpose. Attempts to quantify the hand movements of surgeons during arthroscopic surgery have made limited progress beyond motion analysis of hands and/or instruments, and surrogate markers such as procedure time have been used instead. The dimensionless squared jerk (DSJ) is a measure of deliberate hand movements. This study tests the ability of the DSJ to differentiate novice and expert surgeons (construct validity) performing simulated arthroscopic shoulder surgical tasks. Methods. Six residents (novice group) and six consultants (expert group) participated in this study. Participants performed three validated tasks sequentially under the same experimental setup (one performance). Each participant had ten performances assessed. Hand movements were recorded with an optical tracking system. The DSJ, time taken, total path length, multiple measures of acceleration, and number of movements were recorded. Results. There were significant differences between novices and experts when assessed using time, number of movements with average and minimum acceleration thresholds, and DSJ. No significant differences were observed in maximum acceleration, total path length, or number of movements with a 10 m/s² acceleration threshold. Conclusion. DSJ is an objective parameter that can differentiate novice and expert surgeons’ simulated arthroscopic performances. We propose DSJ as an adjunct to more conventional parameters for arthroscopic surgery skills assessment.

1. Introduction

There is currently no accepted definition of arthroscopic skills competency or proficiency [1]. This makes it difficult for training institutions to set skills assessments for competency-based training [2, 3]. Broadly, these assessments can be categorized as subjective, objective, or an assumption of competence by numbers.

Subjective assessment is the simplest and earliest form of assessment. It follows similar principles to an apprenticeship, where a trainer will give their trainee or apprentice a global assessment [4]. It has been shown that this form of assessment does not reflect the actual level of skill the trainee may possess [2, 5].

To improve the assessment, more objectively based assessment tools have been developed [1, 6–11] that remain feasible and practical [12, 13]. The objective assessment tools described can be broadly divided into quantifiable outcome measurements (such as mean time to perform the task, force measurements, and motion analysis [12, 14–20]) and procedural checklists/global rating scores (GRS) [7, 21–25] (categorical subjective assessment of defined intraprocedural steps).

When a new skill is being learned, performance follows a learning curve that reaches a plateau, which is maintained if the skill is continuously practiced [26, 27]. An individual’s plateau point does not define competence, but it is assumed that, with continued practice, most novices can reach the same skills performance plateau as experienced surgeons [26, 27].

In shoulder arthroscopy skills evaluation, outcome measures that were able to discriminate skill level on simulators include time to completion of tasks, distance and path traveled by the probe, and number of probe collisions [28–31].

Number of movements is difficult to define. It can be described as the number of deliberate movements above a threshold acceleration value. One study used an arbitrary value of 10 m/s² as the threshold to detect deliberate hand movements of the surgeon [12]. Another study used the minimum acceleration value of each participant as the threshold [16]. This lack of clarity on the optimal criteria for determining the number of movements is a key limitation of using this parameter for skill assessment.

Limited progress has been made to quantify hand or instrument movement beyond motion analysis using the parameters above. The dimensionless squared jerk (DSJ) was designed to be less dependent on time and to place more emphasis on movement. In physics, jerk is defined as the rate of change of acceleration; it is therefore the derivative of acceleration with respect to time, the second derivative of velocity, and the third derivative of position. Hogan and Sternad noted that jerk can quantify the smoothness of motion related to hand coordination, but that the measure needed to be made dimensionless so that it would not depend on movement duration, extent, or spurious peaks [4].
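The chain of derivatives described above can be checked numerically: for a position trace x(t) = t³, the third derivative (jerk) is the constant 6. A minimal sketch using NumPy finite differences (illustrative only, not the authors' code):

```python
import numpy as np

t = np.linspace(0.0, 1.0, 1001)
dt = t[1] - t[0]
x = t ** 3                  # position x(t) = t^3

v = np.gradient(x, dt)      # velocity     = 3t^2 (first derivative)
a = np.gradient(v, dt)      # acceleration = 6t   (second derivative)
j = np.gradient(a, dt)      # jerk         = 6    (third derivative of position)

print(round(j[len(j) // 2], 3))  # jerk of t^3 is the constant 6
```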

The dimensionless squared jerk (DSJ) has been accepted as an objective parameter to quantify hand motion in different disciplines, such as parkinsonism, kinetics, and optometry [32–35]. To date, there is no study in the published literature quantifying hand motion using DSJ during simulated arthroscopic surgery.

In this study, we compare the ability of conventional parameters (procedural time, total path length, multiple measures of acceleration, and number of movements) to differentiate between novices and experts performing simulated shoulder arthroscopic tasks. To improve objective assessment of arthroscopic performance, we evaluated the construct validity of DSJ. Our hypothesis is that DSJ can differentiate between novices and experts performing simulated shoulder arthroscopic tasks and can be used as a parameter to train and assess surgeons.

2. Methods

2.1. Participants

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards. Institutional Review Board approval was obtained from Asan Medical Center prior to the study (no. 2017-0292). Informed consent was obtained from all individual participants included in the study. The two groups in this study were the novice group (no hands-on experience of arthroscopic surgery) and the expert group (shoulder arthroscopy consultants). A pilot study was performed to estimate the expected means of the performance parameters in both groups. A priori power analysis showed that a minimum of 51 attempts in each group would be sufficiently powered (80%) at a significance level of 0.05. Twelve volunteers participated in this study: six residents (novice group) and six consultants (expert group). Each participant performed the simulated arthroscopic tasks ten times. All participants were right handed; they therefore controlled the arthroscope with the left hand and maneuvered the surgical instruments with the right hand.
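The paper does not report the effect size assumed in the power analysis. As a hedged illustration only, the sketch below derives a per-group sample size with the standard normal-approximation formula for a two-sided, two-sample comparison; the standardized effect size of 0.56 is hypothetical (chosen because it happens to reproduce roughly 51 attempts per group):

```python
import math
from statistics import NormalDist

def two_sample_n(effect_size, alpha=0.05, power=0.80):
    """Per-group sample size, normal approximation:
    n = 2 * ((z_{1-alpha/2} + z_{power}) / d)^2, rounded up."""
    z_alpha = NormalDist().inv_cdf(1.0 - alpha / 2.0)  # ~1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)               # ~0.84 for 80% power
    return math.ceil(2.0 * ((z_alpha + z_beta) / effect_size) ** 2)

# Hypothetical effect size, not taken from the study.
print(two_sample_n(0.56))  # roughly 51 attempts per group
```

Larger assumed effect sizes shrink the required sample, which is why the (unreported) pilot estimates matter for interpreting the 51-attempt figure.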

2.2. Experiment Setup and Protocol

Each participant performed three simulated arthroscopic tasks with a standard 30° arthroscope with a 105° field of view (Conmed–Linvatec Corporation, Largo, FL). Both groups performed the experiment under the same experimental protocol and design and were evaluated with an optical tracking system. The phantom model, arthroscope, and surgical instruments were arranged in predesigned positions on the preparation table (Figure 1(a)).

Figure 1: Experimental setup. (a) Standardized preparation tool (red dash lines). (b) Motion capture system configuration. Yellow dots are reflective markers. Blue outlined prisms are optical cameras.

The optical tracking system consisted of seven large-volume motion-capture cameras (Prime 41; Natural Point, Inc., Corvallis, OR, USA). These were arranged in a circular configuration to ensure that they could capture two reflective markers. The markers were attached to the dorsal aspect of the participants’ hands at the midshaft of the third metacarpal (Figure 1(b)). Marker trajectories were recorded with the associated tracking software (Motive: Tracker; Natural Point, Inc.) at a sampling rate of 120 Hz.

A human shoulder phantom model (Arthrex Inc., Naples, FL, USA) was modified for shoulder arthroscopic simulation purposes. Five black silks were sutured at five different predetermined sites along the torn lateral border of the simulated rotator cuff (Figure 2).

Figure 2: Modified human phantom shoulder model. The red dotted circle marks the predesigned suture anchor site, and the red arrows indicate the five predetermined points along the lateral border of the rotator cuff.

All participants gave consent and were briefed about the experimental shoulder arthroscopic tasks. Each participant performed three validated shoulder arthroscopic tasks sequentially (Video 1) [2, 16]; the video is available as supplementary material. First, each participant touched five points along the rotator cuff with a grasper. Second, each participant inserted an anchor at a predetermined hole on the footprint of the rotator cuff on the greater tuberosity. Third, each participant pulled sutures through the anterior portal with a grasper.

Participants placed their hand in a predetermined location on the preparation table before the start of each task and returned it to the initial position after completing the task.

The optical tracking system was used to record the three-dimensional (3D) movement of the hands. For each task, data were recorded from the time the hands left the preparation tool until all surgical instruments were returned to their initial locations. All data generated or analyzed during this study are included within this article.

2.3. Data Collection and Analysis

The optical tracking system recorded the 3D position (x, y, z) of each marker as a function of time. Total procedural time was calculated by summing the times for the three tasks, excluding intertask time. Total path length was defined as the distance traveled by the participant’s hands during the three tasks (excluding intertask distance). Acceleration was analyzed in two ways: (1) average acceleration and (2) maximum acceleration. The number of movements was defined by changes in velocity with respect to time, according to three threshold values: (1) acceleration above 10 m/s²; (2) acceleration above the minimum acceleration of each participant; (3) acceleration above the average acceleration of each participant.
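To illustrate how such parameters can be derived from sampled marker positions, the sketch below (our own; the authors' processing pipeline is not published, and all names are illustrative) computes total path length, average/maximum acceleration, and threshold-based movement counts from an (N, 3) position array sampled at 120 Hz:

```python
import numpy as np

def count_movements(acc, threshold):
    """Count rising crossings of an acceleration threshold,
    i.e., candidate deliberate movements as described in the text."""
    return int(np.sum((acc[1:] > threshold) & (acc[:-1] <= threshold)))

def motion_parameters(pos, fs=120.0):
    """pos: (N, 3) marker positions in meters, sampled at fs Hz."""
    dt = 1.0 / fs
    steps = np.linalg.norm(np.diff(pos, axis=0), axis=1)  # per-sample displacement (m)
    speed = steps / dt                                    # m/s
    acc = np.abs(np.diff(speed)) / dt                     # acceleration magnitude (m/s^2)
    return {
        "path_length": steps.sum(),                       # total path length (m)
        "avg_acc": acc.mean(),
        "max_acc": acc.max(),
        "n_move_fixed": count_movements(acc, 10.0),       # fixed 10 m/s^2 threshold
        "n_move_min": count_movements(acc, acc.min()),    # participant's minimum
        "n_move_avg": count_movements(acc, acc.mean()),   # participant's average
    }
```

A constant-velocity trace, for example, yields its geometric length as `path_length` and near-zero accelerations, so no movements are counted at any threshold.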

Each participant’s total DSJ, the jerk made dimensionless, was calculated with the following formula [4, 36, 37]:

DSJ = (J × D³) / v̄²

where J is the time-integrated squared jerk, D is the movement’s duration, and v̄ is the movement’s average velocity, calculated using the 3D position data. The formula for calculating the DSJ was chosen based on the previous study by Hogan and Sternad [4]; we confine our observations to this measure, which has been used in previous studies [36, 37].
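The DSJ described above can be computed from the sampled trajectory by numerical differentiation. A minimal sketch, assuming the scaling of the integrated squared jerk by duration cubed over mean velocity squared (which makes the quantity dimensionless; not the authors' published code):

```python
import numpy as np

def dimensionless_squared_jerk(pos, fs=120.0):
    """DSJ from an (N, 3) position trace sampled at fs Hz:
    (integral of squared jerk) * duration^3 / mean_velocity^2."""
    dt = 1.0 / fs
    vel = np.gradient(pos, dt, axis=0)                            # m/s
    speed = np.linalg.norm(vel, axis=1)
    jerk = np.gradient(np.gradient(vel, dt, axis=0), dt, axis=0)  # m/s^3
    j_squared = np.sum(np.linalg.norm(jerk, axis=1) ** 2) * dt    # integral of j^2 dt
    duration = (len(pos) - 1) * dt
    return j_squared * duration ** 3 / np.mean(speed) ** 2
```

Dimensionally, the integrated squared jerk (m²/s⁵) times duration cubed (s³), divided by mean velocity squared (m²/s²), leaves a pure number: smoother motion (less jerk) gives a lower DSJ.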

A Wilcoxon-Mann-Whitney test was performed to analyze the differences in each parameter. The level of significance was set at 0.05.
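Such a rank-based group comparison can be run with SciPy's Wilcoxon-Mann-Whitney implementation. The values below are made up for illustration and are not the study's data:

```python
from scipy.stats import mannwhitneyu

# Illustrative per-performance task times in seconds (fabricated example data).
novice_times = [182, 205, 174, 198, 221, 190]
expert_times = [101, 96, 118, 110, 89, 104]

stat, p = mannwhitneyu(novice_times, expert_times, alternative="two-sided")
print(f"U = {stat}, p = {p:.4f}")  # p < 0.05 indicates a group difference
```

With small samples and no ties, `mannwhitneyu` computes an exact p value, which suits the 6-versus-6 design here.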

3. Results

There were highly significant differences (p < 0.001) between novices and experts when assessed using time, number of movements (minimum acceleration and average acceleration thresholds), and DSJ. A significant difference was observed in average acceleration (p = 0.050) and range of acceleration (p = 0.046). No significant difference was observed in number of movements (10 m/s² threshold) (p = 0.371), maximum acceleration (p = 0.545), or total path length (p = 0.395) (Table 1). The master data table is available as supplementary material. The main result of this study was presented at the 27th SECEC-ESSSE Congress, held in Berlin, Germany, September 13-16, 2017 [38].

Table 1: The mean of the 10 performances stratified by participant and group (novice versus consultant). Wilcoxon-Mann-Whitney test p values between groups. Min. acc.: minimum acceleration; Avg. acc.: average acceleration; Nov: novice. ∗∗ = statistically highly significant; ∗ = statistically significant.

Consultants were significantly quicker to complete all tasks and had a higher average acceleration and range of acceleration. They also had a highly significantly lower DSJ, indicating that consultants made fewer unwanted and more purposeful movements than novices. Novices made significantly more movements when the threshold was defined using minimum or average acceleration. Using 10 m/s² as the acceleration threshold to define a movement [16], novices tended to make more movements, but this did not reach significance (Figures 3(a), 3(b), 3(c), and 3(d)).

Figure 3: Whisker plots for (a) time, (b) dimensionless squared jerk, (c) number of movements (minimum acceleration), and (d) number of movements (average acceleration).

4. Discussion

We have shown that DSJ as an objective parameter achieves construct validity in differentiating between novices and experts performing simulated shoulder arthroscopic tasks. This is logical, as other studies have shown motion analysis to be a valid assessment tool in determining skill level [12, 39–42].

Our results show that experts performed simulated arthroscopic tasks faster than novices, in keeping with other studies [12, 43]. However, we would not conclude that the fastest surgeons are the best surgeons, as procedural time alone gives no information on movement control and does not quantify the risk of unnecessary iatrogenic injury. We did see a significantly larger range of acceleration in the experts compared to the novices, whilst other studies reported that experts proceeded with a higher velocity [12, 28, 41, 42, 44].

Number of movements may help quantify the risk of iatrogenic injury. In this study, we observed that novices consistently made more hand movements. This is in keeping with other studies showing that novices typically demonstrate unnecessary hand movements compared to experts [41, 44].

Using number of movements as an objective parameter to measure arthroscopic performance is difficult. This is partly because spurious peaks may arise from two or more submovements [45, 46] and partly because there is no consensus on the definition of a purposeful movement. We evaluated three methods of defining a purposeful movement and support Jung et al.’s definition: one that results in an acceleration exceeding the threshold set by the minimum acceleration of the participant [16]. This definition achieved statistical significance, as did defining a purposeful movement as an acceleration exceeding the average acceleration of the participant.

Defining a purposeful movement can be avoided by using DSJ, as it is a parameter based on the rate of change of acceleration and is thus independent of any threshold value. This allows DSJ to provide a measure of deliberate hand movements by taking account of the changes in acceleration (jerk) and the area under the jerk curve, eliminating the potential bias induced by spurious peaks [4, 45, 46].

Another parameter that has been used to differentiate the skill level of surgeons in arthroscopy is total path length [12, 43]. In this study, the expert group’s hand motion tracked a longer total path length than the novice group’s. We postulate that a longer total path length may reflect the care taken to avoid iatrogenic damage to intra-articular structures in this anatomical simulated arthroscopic study. The shortest path between two points is a straight line, but to avoid collisions, the experts may have adopted a longer, nonlinear path. Further studies incorporating collision data alongside motion analysis would help investigate this.

4.1. Limitations

We accept that this study has several limitations. First, the number of participants in each group was low. Second, we considered only three shoulder arthroscopy tasks, whilst numerous techniques and skills are utilized during arthroscopic surgery. Future studies need to expand the number of standardized arthroscopic tasks; we could not assess the novices’ ability to perform complicated tasks without providing some training, which would have interfered with the results. Third, hand motions were represented by only two markers; such simplified motion capture cannot analyze wrist, forearm, or elbow motion. Fourth, we assume that the simulated arthroscopic tasks correlate with intraoperative performance. Fifth, there is no clear definition of an expert arthroscopist. Lastly, we acknowledge that several methods have been proposed for quantifying smoothness of movement; nevertheless, we felt that the jerk-based measure proposed by Hogan and Sternad would meet our study’s purposes [4].

5. Conclusion

DSJ is an objective parameter that can differentiate experts and novices at simulated shoulder arthroscopy. Modern day training requires objective skills assessment to support competency-based curricula, and the DSJ can function as a useful objective performance parameter to measure deliberate movements alongside other motion analysis parameters.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Disclosure

Level of evidence: diagnostic study, Level III.

Conflicts of Interest

The authors declare that there are no conflicts of interest.

Acknowledgments

This study was supported by the Global Frontier R&D Program on Human-Centered Interaction for Coexistence. This study was funded by National Research Foundation of Korea via the Korean Government (MSIP) (grant no. 2017-0522).

Supplementary Materials

Video 1 in Methods and the master data table in Results are available as supplementary materials. The video shows the three standardized arthroscopic tasks each participant completed. The master data table presents performance data by participant and group (novice versus consultant), with Wilcoxon-Mann-Whitney test p values between groups. (Supplementary Materials)

References

  1. J. L. Hodgins and C. Veillette, “Arthroscopic proficiency: methods in evaluating competency,” BMC Medical Education, vol. 13, no. 1, 2013.
  2. A. Darzi, S. Smith, and N. Taffinder, “Assessing operative skill. Needs to become more objective,” BMJ, vol. 318, no. 7188, pp. 887–888, 1999.
  3. J. A. Milburn, G. Khera, S. T. Hornby, P. S. Malone, and J. E. Fitzgerald, “Introduction, availability and role of simulation in surgical education and training: review of current evidence and recommendations from the Association of Surgeons in Training,” International Journal of Surgery, vol. 10, no. 8, pp. 393–398, 2012.
  4. N. Hogan and D. Sternad, “Sensitivity of Smoothness Measures to Movement Duration, Amplitude, and Arrests,” Journal of Motor Behavior, vol. 41, no. 6, pp. 529–534, 2009.
  5. J. D. Mabrey, S. D. Gillogly, J. R. Kasser et al., “Virtual reality simulation of arthroscopy of the knee,” Arthroscopy: The Journal of Arthroscopic & Related Surgery, vol. 18, no. 6, p. E28, 2002.
  6. A. Alvand, K. Logishetty, R. Middleton, T. Khan, W. F. Jackson, A. J. Price et al., “Validating a global rating scale to monitor individual resident learning curves during arthroscopic knee meniscal repair,” Arthroscopy: The Journal of Arthroscopic & Related Surgery, vol. 29, no. 5, pp. 906–912, 2013.
  7. N. R. Howells, H. S. Gill, A. J. Carr, A. J. Price, and J. L. Rees, “Transferring simulated arthroscopic skills to the operating theatre,” The Journal of Bone & Joint Surgery, vol. 90-B, no. 4, pp. 494–499, 2008.
  8. A. Insel, B. Carofino, R. Leger, R. Arciero, and A. D. Mazzocca, “The development of an objective model to assess arthroscopic performance,” The Journal of Bone & Joint Surgery—American Volume, vol. 91, no. 9, pp. 2287–2295, 2009.
  9. R. Koehler, T. John, J. Lawler, C. Moorman, and G. Nicandri, “Arthroscopic training resources in orthopedic resident education,” The Journal of Knee Surgery, vol. 28, no. 1, pp. 67–74, 2015.
  10. J. A. Martin, G. Regehr, R. Reznick et al., “Objective structured assessment of technical skill (OSATS) for surgical residents,” British Journal of Surgery, vol. 84, no. 2, pp. 273–278, 1997.
  11. R. M. Middleton, M. J. Baldwin, K. Akhtar, A. Alvand, and J. L. Rees, “Which global rating scale? A comparison of the ASSET, BAKSSS, and IGARS for the assessment of simulated arthroscopic skills,” Journal of Bone and Joint Surgery - American Volume, vol. 98, no. 1, pp. 75–81, 2016.
  12. N. R. Howells, M. D. Brinsden, R. S. Gill, A. J. Carr, and J. L. Rees, “Motion analysis: a validated method for showing skill levels in arthroscopy,” Arthroscopy: The Journal of Arthroscopic & Related Surgery, vol. 24, no. 3, pp. 335–342, 2008.
  13. R. S. Sidhu, E. D. Grober, L. J. Musselman, and R. K. Reznick, “Assessing competency in surgery: Where to begin?” Surgery, vol. 135, no. 1, pp. 6–20, 2004.
  14. A. Alvand, S. Auplish, H. Gill, and J. Rees, “Innate arthroscopic skills in medical students and variation in learning curves,” The Journal of Bone & Joint Surgery, vol. 93, no. 19, p. e115.9, 2011.
  15. H. Fatima, D. K. Rex, R. Rothstein et al., “Cecal Insertion and Withdrawal Times With Wide-Angle Versus Standard Colonoscopes: A Randomized Controlled Trial,” Clinical Gastroenterology and Hepatology, vol. 6, no. 1, pp. 109–114, 2008.
  16. K. Jung, D. Kang, A. L. Kekatpure, A. Adikrishna, J. Hong, and I. Jeon, “A new wide-angle arthroscopic system: a comparative study with a conventional 30° arthroscopic system,” Knee Surgery, Sports Traumatology, Arthroscopy, vol. 24, no. 5, pp. 1722–1729, 2016.
  17. K. Kim, D. Kim, K. Matsumiya, E. Kobayashi, and T. Dohi, “Wide FOV Wedge Prism Endoscope,” in Proceedings of the 2005 IEEE Engineering in Medicine and Biology 27th Annual Conference, pp. 5758–5761, Shanghai, China, January 2006.
  18. S.-i. Kim and T. S. Suh, “World Congress on Medical Physics and Biomedical Engineering 2006,” in Proceedings of the Imaging The Future Medicine, Seoul, Korea, 2006.
  19. M. Pellisé, G. Fernández–Esparrach, A. Cárdenas et al., “Impact of Wide-Angle, High-Definition Endoscopy in the Diagnosis of Colorectal Neoplasia: A Randomized Controlled Trial,” Gastroenterology, vol. 135, no. 4, pp. 1062–1068, 2008.
  20. D. K. Rex, V. Chadalawada, and D. J. Helper, “Wide angle colonoscopy with a prototype instrument: Impact on miss rates and efficiency as determined by back-to-back colonoscopies,” American Journal of Gastroenterology, vol. 98, no. 9, pp. 2000–2005, 2003.
  21. R. Aggarwal, T. Grantcharov, K. Moorthy, T. Milland, and A. Darzi, “Toward feasible, valid, and reliable video-based assessments of technical surgical skills in the operating room,” Annals of Surgery, vol. 247, no. 2, pp. 372–379, 2008.
  22. J. D. Doyle, E. M. Webber, and R. S. Sidhu, “A universal global rating scale for the evaluation of technical skills in the operating room,” The American Journal of Surgery, vol. 193, no. 5, pp. 551–555, 2007.
  23. T. R. Eubanks, R. H. Clements, D. Pohl et al., “An objective scoring system for laparoscopic cholecystectomy,” Journal of the American College of Surgeons, vol. 189, no. 6, pp. 566–574, 1999.
  24. M. C. Vassiliou, L. S. Feldman, C. G. Andrew et al., “A global assessment tool for evaluation of intraoperative laparoscopic skills,” The American Journal of Surgery, vol. 190, no. 1, pp. 107–113, 2005.
  25. M. C. Vassiliou, P. A. Kaneva, B. K. Poulose et al., “Global assessment of gastrointestinal endoscopic skills (GAGES): A valid measurement tool for technical skills in flexible endoscopy,” Surgical Endoscopy, vol. 24, no. 8, pp. 1834–1841, 2010.
  26. N. R. Howells, S. Auplish, G. C. Hand, H. S. Gill, A. J. Carr, and J. L. Rees, “Retention of arthroscopic shoulder skills learned with use of a simulator: Demonstration of a learning curve and loss of performance level after a time delay,” The Journal of Bone & Joint Surgery, vol. 91, no. 5, pp. 1207–1213, 2009.
  27. W. M. Jackson, T. Khan, A. Alvand et al., “Learning and Retaining Simulated Arthroscopic Meniscal Repair Skills,” The Journal of Bone and Joint Surgery-American Volume, vol. 94, no. 17, pp. e132–1-8, 2012.
  28. A. H. Gomoll, R. V. O'Toole, J. Czarnecki, and J. J. P. Warner, “Surgical experience correlates with performance on a virtual reality simulator for shoulder arthroscopy,” The American Journal of Sports Medicine, vol. 35, no. 6, pp. 883–888, 2007.
  29. K. D. Martin, P. J. Belmont, A. J. Schoenfeld, M. Todd, K. L. Cameron, and B. D. Owens, “Arthroscopic Basic Task Performance in Shoulder Simulator Model Correlates with Similar Task Performance in Cadavers,” The Journal of Bone and Joint Surgery-American Volume, vol. 93, no. 21, pp. e127(1)–e127(5), 2011.
  30. R. A. Pedowitz, J. Esch, and S. Snyder, “Evaluation of a virtual reality simulator for arthroscopy skills development,” Arthroscopy: The Journal of Arthroscopic & Related Surgery, vol. 18, no. 6, pp. E29–E39, 2002.
  31. S. Smith, A. Wan, N. Taffinder, S. Read, R. Emery, and A. Darzi, “Early experience and validation work with Procedicus VA - the Prosolvia virtual reality shoulder arthroscopy trainer,” Studies in Health Technology and Informatics, vol. 62, pp. 337–343, 1999.
  32. T. Flash and N. Hogan, “The coordination of arm movements: an experimentally confirmed mathematical model,” The Journal of Neuroscience, vol. 5, no. 7, pp. 1688–1703, 1985.
  33. C. J. Ketcham, R. D. Seidler, A. W. Van Gemmert, and G. E. Stelmach, “Age-related kinematic differences as influenced by task difficulty, target size, and movement amplitude,” The Journals of Gerontology Series B: Psychological Sciences and Social Sciences, vol. 57, no. 1, pp. P54–P64, 2002.
  34. H. L. Teulings, J. L. Contreras-Vidal, G. E. Stelmach, and C. H. Adler, “Parkinsonism reduces coordination of fingers, wrist, and arm in fine motor control,” Experimental Neurology, vol. 146, no. 1, pp. 159–170, 1997.
  35. H. J. Wyatt, “Detecting saccades with jerk,” Vision Research, vol. 38, no. 14, pp. 2147–2153, 1998.
  36. K. Takada, K. Yashiro, and M. Takagi, “Reliability and sensitivity of jerk-cost measurement for evaluating irregularity of chewing jaw movements,” Physiological Measurement, vol. 27, no. 7, article no. 005, pp. 609–622, 2006.
  37. K. Yashiro, T. Nakamura, T. Mizumori, H. Yatani, and K. Takada, “Clinical validity of measuring jerk-cost of jaw movement during speech: Effect of mouthguard design on smoothness of jaw movements,” in Proceedings of the SICE Annual Conference 2004, pp. 2741–2744, Sapporo, Japan, August 2004.
  38. E. A. A. Kholinne, M. J. Gandhi, H. P. Hong, and I. H. Jeon, “The Dimensionless Squared Jerk (DSJ) - A novel objective parameter that improves assessment of hand motion analysis during shoulder arthroscopy,” in Proceedings of the 27th SECEC-ESSSE Congress, Berlin, Germany, 2017.
  39. V. Datta, A. Chang, S. Mackay, and A. Darzi, “The relationship between motion analysis and surgical technical assessments,” The American Journal of Surgery, vol. 184, no. 1, pp. 70–73, 2002.
  40. J. D. Mason, J. Ansell, N. Warren, and J. Torkington, “Is motion analysis a valid tool for assessing laparoscopic skill?” Surgical Endoscopy, vol. 27, no. 5, pp. 1468–1477, 2013.
  41. A. Nasr, B. Carrillo, J. T. Gerstle, and G. Azzie, “Motion analysis in the pediatric laparoscopic surgery (PLS) simulator: Validation and potential use in teaching and assessing surgical skills,” Journal of Pediatric Surgery, vol. 49, no. 5, pp. 791–794, 2014.
  42. Y. Tashiro, H. Miura, Y. Nakanishi, K. Okazaki, and Y. Iwamoto, “Evaluation of Skills in Arthroscopic Training Based on Trajectory and Force Data,” Clinical Orthopaedics and Related Research, vol. 467, no. 2, pp. 546–552, 2009.
  43. G. S. J. Kirby, P. Guyver, L. Strickland et al., “Assessing arthroscopic skills using wireless elbow-worn motion sensors,” Journal of Bone and Joint Surgery - American Volume, vol. 97, no. 13, pp. 1119–1127, 2014.
  44. V. Datta, S. Mackay, M. Mandalia, and A. Darzi, “The use of electromagnetic motion tracking analysis to objectively measure open surgical skill in the laboratory-based model,” Journal of the American College of Surgeons, vol. 193, no. 5, pp. 479–485, 2001.
  45. M. Drakesmith, K. Caeyenberghs, A. Dutt, G. Lewis, A. S. David, and D. K. Jones, “Overcoming the effects of false positives and threshold bias in graph theoretical analyses of neuroimaging data,” NeuroImage, vol. 118, pp. 313–333, 2015.
  46. B. Rohrer and N. Hogan, “Avoiding spurious submovement decompositions II: A scattershot algorithm,” Biological Cybernetics, vol. 94, no. 5, pp. 409–414, 2006.