Research Papers

Digitalization of Human Operations in the Age of Cyber Manufacturing: Sensorimotor Analysis of Manual Grinding Performance

Author and Article Information
Gregory L. Bales

Mechanical and Aerospace Engineering,
University of California,
Davis, CA 95616
e-mail: glbales@ucdavis.edu

Jayanti Das

Mechanical and Aerospace Engineering,
University of California,
Davis, CA 95616
e-mail: jydas@ucdavis.edu

Jason Tsugawa

Mechanical and Aerospace Engineering,
University of California,
Davis, CA 95616
e-mail: jztsugawa@ucdavis.edu

Barbara Linke

Mechanical and Aerospace Engineering,
University of California,
Davis, CA 95616
e-mail: bslinke@ucdavis.edu

Zhaodan Kong

Mechanical and Aerospace Engineering,
University of California,
Davis, CA 95616
e-mail: zdkong@ucdavis.edu

Manuscript received February 3, 2017; final manuscript received August 13, 2017; published online September 1, 2017. Assoc. Editor: Ivan Selesnick.

J. Manuf. Sci. Eng. 139(10), 101011 (Sep 01, 2017) (8 pages) Paper No: MANU-17-1071; doi: 10.1115/1.4037615

This paper presents new techniques for analyzing and understanding the sensorimotor characteristics of manual operations such as grinding, and links those characteristics to process performance. A grinding task, though simple, requires the practitioner to draw on a large repertoire of skills. Based on joint gaze, force, and velocity data collected from a series of manual grinding experiments, we compare operators with different levels of experience and quantitatively describe characteristics of human manual skill and their effects on manufacturing process parameters such as cutting energy, surface finish, and material removal rate (MRR). For instance, we find that an experienced subject performs the task precisely, moving the tool along complex paths with lower applied forces and velocities and shorter fixations than a novice. A detailed understanding of gaze-motor behavior broadens our knowledge of how a manual task is executed. Our results provide this extra insight and inform future efforts in workforce training as well as the digitalization of manual expertise, thereby facilitating the transformation of raw data into product-specific knowledge.
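The process metrics named in the abstract can be made concrete. The following is an illustrative sketch, not the authors' code: specific cutting energy, i.e., energy expended per unit volume of material removed, estimated from synchronized tangential-force and tool-velocity samples of the kind the study collects. The function name, units, and values are hypothetical.

```python
def specific_cutting_energy(f_t, v_t, dt, volume_removed_mm3):
    """Estimate specific cutting energy (J/mm^3) over one grinding trial.

    f_t : tangential force samples in N
    v_t : tangential tool speed samples in m/s (synchronized with f_t)
    dt  : sampling interval in s
    volume_removed_mm3 : total material removed during the trial in mm^3
    """
    # Instantaneous cutting power is F_t * v_t; integrating over the
    # trial (rectangle rule) gives the total cutting energy in joules.
    energy_j = sum(f * v for f, v in zip(f_t, v_t)) * dt
    return energy_j / volume_removed_mm3


# Example with constant force and speed: 10 N at 0.5 m/s for 1 s
# (100 samples at 10 ms) removing 2 mm^3 of material.
e_c = specific_cutting_energy([10.0] * 100, [0.5] * 100, 0.01, 2.0)
```

Comparing this quantity across trials is one way to express the abstract's observation that experienced operators achieve removal at lower applied forces.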

Copyright © 2017 by ASME




Fig. 1

Setup of our grinding experiment. Data were collected from three separate modules: (1) gaze tracking, consisting of SensoMotoric Instruments eye-tracking glasses and a computer running iView recording software; (2) force measurement, consisting of a triaxial load cell and a computer running LabVIEW; and (3) a motion capture system by OptiTrack, which determines the position and orientation of selected objects. The data collected from these three modules were synchronized and analyzed using the methods described in Sec. 3.

Fig. 2

Detail of the grinding sample and force data collection module. Forces were measured in three directions: tangential (x-axis), normal (z-axis), and axial (y-axis). The reflective spheres are used by the motion tracking system.

Fig. 3

An example scanpath. The centers of fixations are denoted by points, and the durations of fixations are represented by the diameters of the circles. The fixation centers are connected by straight lines according to their temporal order; each straight line corresponds to a saccade.

Fig. 4

Normalized histogram of the tangential and axial tool velocities for all subjects

Fig. 5

Comparison of skewed fixation distributions between subjects. A vertical line within each rectangular box indicates the median of the distribution. Outliers beyond the 90th percentile are labeled with a cross.

Fig. 6

Comparison of modal responses in the gaze-motor behavior of all subjects

Fig. 7

Sample of fixation points for a single subject. Positions are reported in pixels on the original 1280 × 960 pixel field of view. Notice the asymmetric dispersion of shifts.

Fig. 8

Distributions of the fixational variations for all trials. The whiskers extend to the 90th percentile of the distribution; outliers are represented as red crosses.

Fig. 9

(a) Plot of normal and tangential forces for all subjects. (b) Tangential and normal force distributions between subjects. The height of each bar represents the mean force for the trial, with standard deviations indicated.

Fig. 10

Tangential force variation versus mass removal during grinding

Fig. 11

Normal force variation versus mass removal during grinding

Fig. 12

Average surface roughness variation versus mass removal during grinding

Fig. 13

Average surface roughness variation versus tangential force
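The fixations plotted in the figures above (e.g., Figs. 3 and 7) can be extracted from raw gaze samples with a dispersion-threshold algorithm (I-DT), a standard method in eye-tracking analysis. Whether the study used exactly this algorithm is an assumption; the thresholds and data below are illustrative only.

```python
def dispersion(window):
    """Spread of a gaze-sample window: (max x - min x) + (max y - min y), in pixels."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))


def idt_fixations(points, times, max_dispersion=30.0, min_duration=0.1):
    """Return (start_time, end_time, centroid) for each detected fixation.

    points : list of (x, y) gaze samples in pixels
    times  : matching timestamps in seconds
    """
    fixations = []
    i, n = 0, len(points)
    while i < n:
        # Grow a window until it spans at least min_duration.
        j = i
        while j < n and times[j] - times[i] < min_duration:
            j += 1
        if j >= n:
            break
        if dispersion(points[i:j + 1]) <= max_dispersion:
            # Extend the window while dispersion stays under the threshold.
            while j + 1 < n and dispersion(points[i:j + 2]) <= max_dispersion:
                j += 1
            xs = [p[0] for p in points[i:j + 1]]
            ys = [p[1] for p in points[i:j + 1]]
            fixations.append((times[i], times[j],
                              (sum(xs) / len(xs), sum(ys) / len(ys))))
            i = j + 1
        else:
            i += 1
    return fixations


# Two steady gaze clusters separated by a saccade-like jump yield two fixations.
pts = [(100.0, 100.0)] * 20 + [(500.0, 500.0)] * 20
ts = [0.02 * k for k in range(40)]
fixes = idt_fixations(pts, ts)
```

Fixation durations (end minus start time) and centroids from such an algorithm are the quantities summarized in Figs. 5, 7, and 8.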


