Adaptive Part Inspection Through Developmental Vision

[+] Author and Article Information
Gil Abramovich

Department of Mechanical Engineering, The University of Michigan, Ann Arbor, MI 48109; gabramov@engine.umich.edu

Juyang Weng

Department of Computer Science and Engineering, Michigan State University, East Lansing, MI 48824; weng@cse.msu.edu

Debasish Dutta

Department of Mechanical Engineering, The University of Michigan, Ann Arbor, MI 48109; dutta@engine.umich.edu

J. Manuf. Sci. Eng. 127(4), 846-856 (Mar 08, 2005) (11 pages); doi:10.1115/1.2039103. History: Received November 21, 2003; Revised March 08, 2005

We present a novel online inspection method for manufacturing processes that automatically adapts to variations in part and environmental properties. The method is based on a developmental learning architecture comprising a procedure that focuses attention on apparently defective regions, a recognition method that performs automatic feature derivation from a set of training images together with hierarchical classification, and an action step that controls attention and further decision processes. The method adapts to variations incrementally by updating, rather than recreating, the training information, and it can inspect and train simultaneously. Addressing a new inspection task requires from the human developer neither reprogramming and compatibility tests nor quantitative knowledge about the image set; instead, the inspection system is trained, automatically or manually, according to simple guidelines. These attributes allow the method to improve online performance with minimal ramp-up time. Our system inspected three applications with low error rates and fast recognition, confirming its suitability for general-purpose, real-time, online inspection.
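The adapt-while-inspecting behavior the abstract describes can be illustrated with a minimal sketch. Everything below is a hypothetical stand-in, not the authors' implementation: a variance-based attention window replaces the paper's attention procedure, and an incrementally updated nearest-centroid classifier replaces the hierarchical (IHDR-style) classifier. Only the loop shape, attend, train incrementally, inspect, mirrors the architecture.

```python
import numpy as np

class IncrementalInspector:
    """Hypothetical sketch of the attend / recognize / act loop.

    A nearest-centroid classifier stands in for the paper's hierarchical
    classifier; centroids are updated incrementally, so training and
    inspection can interleave without retraining from scratch.
    """

    def __init__(self):
        self.centroids = {}  # label -> running mean feature vector
        self.counts = {}     # label -> number of samples seen

    def attend(self, image, window=8):
        """Return the patch with the highest local variance, a crude
        stand-in for 'apparently defective region'."""
        h, w = image.shape
        best, best_var = (0, 0), -1.0
        for r in range(0, h - window + 1, window):
            for c in range(0, w - window + 1, window):
                v = image[r:r + window, c:c + window].var()
                if v > best_var:
                    best, best_var = (r, c), v
        r, c = best
        return image[r:r + window, c:c + window]

    def train(self, patch, label):
        """Incremental update: refine the stored mean, never recreate it."""
        x = patch.ravel().astype(float)
        n = self.counts.get(label, 0)
        mu = self.centroids.get(label, np.zeros_like(x))
        self.centroids[label] = (mu * n + x) / (n + 1)
        self.counts[label] = n + 1

    def inspect(self, image):
        """Classify the attended patch by its nearest centroid."""
        x = self.attend(image).ravel().astype(float)
        return min(self.centroids,
                   key=lambda k: np.linalg.norm(self.centroids[k] - x))
```

Because `train` only folds each new sample into a running mean, the training information is updated rather than recreated, which is the property the abstract emphasizes for minimal ramp-up time.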

Copyright © 2005 by American Society of Mechanical Engineers



Figure 1

Dimensional landmarks: (a) Locating hole (left) on an engine head (right); (b) circular part—the notch can serve as an orientation and a positioning landmark

Figure 2

Arrangement of different applications according to position relevance

Figure 3

Automatic feature derivation

Figure 4

(a) Coarse clustering; (b) fine clustering. (a.1) and (b.1) show the input space, (a.2) and (b.2) the output space, and (a.3) and (b.3) the node space of the IHDR (incremental hierarchical discriminant regression) tree
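The coarse-to-fine organization suggested by Fig. 4 can be sketched as a two-level lookup: an input is first matched to the nearest coarse cluster, then refined among that cluster's fine children. This is only an illustrative sketch of the idea, with hypothetical names; the actual IHDR tree grows incrementally and uses discriminant subspaces rather than plain Euclidean centroids.

```python
import numpy as np

def coarse_to_fine_lookup(x, coarse_centers, fine_centers, fine_labels):
    """Two-level nearest-center lookup reminiscent of Fig. 4.

    coarse_centers: list of coarse-cluster center vectors
    fine_centers:   fine_centers[i] lists child centers of coarse cluster i
    fine_labels:    fine_labels[i][j] is the output for fine cluster j of i
    """
    # Coarse stage: pick the nearest top-level cluster.
    i = int(np.argmin([np.linalg.norm(x - c) for c in coarse_centers]))
    # Fine stage: refine the match among that cluster's children only.
    j = int(np.argmin([np.linalg.norm(x - f) for f in fine_centers[i]]))
    return fine_labels[i][j]
```

Searching only the children of the winning coarse cluster is what keeps retrieval fast: the cost grows with tree depth rather than with the total number of fine clusters.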

Figure 5

A general-purpose developmental vision architecture, divided into sensory, cognitive, and motor mapping

Figure 6

Hierarchical sensory mapping

Figure 7

Positively separable regions: the black and white dots denote two different classes; d is the minimal breadth of the separation zone
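For two finite point sets like those in Fig. 7, the gap between the classes can be measured as the smallest pairwise distance between them. The helper below is a hypothetical illustration of that quantity as a proxy for the separation-zone breadth d, not a function from the paper.

```python
import numpy as np

def separation_breadth(class_a, class_b):
    """Smallest pairwise distance between two point sets: a simple
    proxy for the separation-zone breadth d of Fig. 7."""
    return min(np.linalg.norm(a - b) for a in class_a for b in class_b)
```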

Figure 8

Sample space and decision boundaries: (a) Separable classes; (b) same classes, with the introduction of variation in a specific property; (c) same as (b), with separation into subclasses resulting in low-degree boundaries between subclasses

Figure 9

(a) Experimental setup; (b) acquisition strategy

Figure 10

Three case studies: (a) textural surface defects; (b) dimensional landmarks—notch of a piston ring; (c) object orientation detection

Figure 11

Partial versions of our architecture: (a) architecture for surface defect inspection by texture analysis and object positioning; (b) architecture for landmark detection and location; (c) landmark detection by active vision with both low level "innate" position information and a developmental appearance-based classifier



