A Data Mining Approach for Generation of Control Signatures

[+] Author and Article Information
Andrew Kusiak

Intelligent Systems Laboratory, Department of Mechanical and Industrial Engineering, 2139 Seamans Center, The University of Iowa, Iowa City, Iowa 52242-1527 e-mail: andrew-kusiak@uiowa.edu http://www.icaen.uiowa.edu/∼ankusiak

J. Manuf. Sci. Eng., 124(4), pp. 923-926 (4 pages), doi:10.1115/1.1511524. History: Received February 01, 2001; Revised January 01, 2002; Published online October 23, 2002.
Copyright © 2002 by ASME
Figure: Data sets for a two-stage process


