DOI: 10.1145/3305275.3305328

Research on Development and Application of Support Vector Machine - Transformer Fault Diagnosis

Published: 29 December 2018

Abstract

Support Vector Machine (SVM) is a machine learning method based on statistical learning theory that solves classification and regression problems by means of optimization. It copes well with small sample sizes, nonlinearity, and high dimensionality, and largely avoids the "curse of dimensionality", over-fitting, and local-minimum problems that affect traditional statistical methods. Some difficulties remain, however, such as high algorithmic complexity and poor scalability to large data sets. This article systematically introduces the theory of support vector machines, summarizes the common training algorithms of the standard (traditional) SVM together with their shortcomings, and reviews the new learning models and algorithms developed on this basis. The practical effectiveness and applicable scope of each SVM model are then verified through an application to transformer fault diagnosis.
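The abstract states that SVM solves classification by optimization and that the models are tested on transformer fault diagnosis. As a concrete illustration (not taken from the paper), the sketch below applies a standard soft-margin, RBF-kernel SVM to such a setting; the dissolved-gas-analysis (DGA) features, fault labels, and scikit-learn pipeline are assumptions for illustration only.

```python
# Minimal sketch (not the paper's method): a soft-margin RBF-kernel SVM
# classifier for transformer fault diagnosis. The dissolved-gas-analysis
# (DGA) features and fault labels below are hypothetical placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical data: each row holds dissolved-gas concentrations
# (e.g., H2, CH4, C2H6, C2H4, C2H2); each label encodes a fault class
# (0 = normal, 1 = thermal fault, 2 = discharge fault).
rng = np.random.default_rng(0)
X = rng.random((200, 5))          # placeholder gas-concentration features
y = rng.integers(0, 3, size=200)  # placeholder fault labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Soft-margin SVM with an RBF kernel; C and gamma are the usual
# regularization and kernel-width hyperparameters to be tuned.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```

Whichever SVM variant the paper actually evaluates would take the place of SVC here, with C, gamma, and the kernel chosen by cross-validation on real fault records.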


Cited By

  • (2022) Automatic classification of OER for metadata quality assessment. 2022 International Conference on Advanced Learning Technologies (ICALT), pp. 16-18. DOI: 10.1109/ICALT55010.2022.00011. Online publication date: July 2022.


Published In

ISBDAI '18: Proceedings of the International Symposium on Big Data and Artificial Intelligence
December 2018
365 pages
ISBN:9781450365703
DOI:10.1145/3305275
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

In-Cooperation

  • International Engineering and Technology Institute, Hong Kong

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 29 December 2018


Author Tags

  1. Statistical learning theory
  2. fault diagnosis
  3. learning algorithm
  4. loss function
  5. support vector machine
  6. transformer

Qualifiers

  • Short-paper
  • Research
  • Refereed limited

Conference

ISBDAI '18

Acceptance Rates

ISBDAI '18 paper acceptance rate: 70 of 340 submissions (21%)
Overall acceptance rate: 70 of 340 submissions (21%)

Article Metrics

  • Downloads (last 12 months): 7
  • Downloads (last 6 weeks): 0
Reflects downloads up to 20 Jan 2025

