A Clustering Approach by SSPCO Optimization Algorithm Based on Chaotic Initial Population
Journal of Electrical and Computer Engineering Innovations (JECEI)
Article 5, Volume 4, Issue 1 (Serial Number 7), Farvardin 2016 (March-April 2016), Pages 31-38; original article PDF (805.96 K)
Digital Object Identifier (DOI): 10.22061/jecei.2016.531
Authors
R. Omidvar*1; H. Parvin2; A. Eskandari3
1 Young Researchers and Elite Club, Yasooj Branch, Islamic Azad University, Yasooj, Iran
2 Young Researchers and Elite Club, Nourabad Mamasani Branch, Islamic Azad University, Nourabad Mamasani, Iran
3 Sama Technical and Vocational Training College, Azad University of Shiraz, Shiraz, Iran
Received: 02 Mordad 1395 (23 July 2016); Revised: 19 Shahrivar 1395 (09 September 2016); Accepted: 19 Shahrivar 1395 (09 September 2016)
Abstract
The main task of clustering analysis is to assign a set of objects to groups (clusters) such that objects within a cluster are more similar to one another than to objects in other clusters. The SSPCO optimization algorithm is a recent optimization algorithm inspired by the behavior of a bird called the see-see partridge. Clustering is one of the problems to which such swarm-intelligence algorithms are applied: it serves as a powerful tool in many data mining, data analysis, and data compression applications, grouping data into a given number of clusters. In the present article, a chaotic SSPCO algorithm is used to cluster data on different benchmarks and datasets, and the results are compared with clustering based on the artificial bee colony (ABC) algorithm and the particle swarm optimization (PSO) clustering technique. Clustering tests have been carried out on 13 datasets from the UCI machine learning repository. The results show that SSPCO-based clustering is a highly efficient technique for clustering multivariate data.
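The following Python sketch is added purely for illustration and is not the authors' implementation. Assuming the chaotic initial population is generated with a logistic map (one commonly used chaotic map; the paper's exact map and parameters are not stated here), it shows how candidate cluster centroids can be seeded chaotically inside the data range and scored with the usual clustering cost, the sum of distances from each point to its nearest centroid, which is the quantity that swarm-based clustering methods such as SSPCO, ABC, and PSO minimize. All function names, parameter values, and the synthetic data are illustrative assumptions.

import numpy as np

def logistic_map_sequence(length, x0=0.7, r=4.0):
    # Chaotic sequence in (0, 1) from the logistic map x <- r * x * (1 - x).
    seq = np.empty(length)
    x = x0
    for i in range(length):
        x = r * x * (1.0 - x)
        seq[i] = x
    return seq

def chaotic_initial_population(pop_size, n_clusters, data):
    # Each individual is a set of candidate centroids placed inside the data
    # bounding box using chaotic values instead of uniform random numbers.
    n_features = data.shape[1]
    lo, hi = data.min(axis=0), data.max(axis=0)
    chaos = logistic_map_sequence(pop_size * n_clusters * n_features)
    chaos = chaos.reshape(pop_size, n_clusters, n_features)
    return lo + chaos * (hi - lo)   # shape: (pop_size, n_clusters, n_features)

def clustering_cost(centroids, data):
    # Sum of Euclidean distances from every point to its nearest centroid.
    dists = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
    return dists.min(axis=1).sum()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.normal(size=(150, 4))   # stand-in for a small UCI-style dataset
    population = chaotic_initial_population(pop_size=20, n_clusters=3, data=data)
    costs = [clustering_cost(c, data) for c in population]
    print("best initial clustering cost:", min(costs))

In a full run, these chaotically seeded candidates would then be refined by the optimizer's update rules; only the initialization and the cost function are sketched here.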
Keywords
SSPCO algorithm; Chaotic; Clustering; Initial Population; Data set
Statistics: Article views: 1,183; original article file downloads: 1,261