ABSTRACT
Objective optimization is crucial in industrial production. For high-dimensional data, however, obtaining an approximate solution close to the true optimum is challenging and often time-consuming. This paper therefore proposes an approach to objective optimization of high-dimensional data that combines simulated annealing with gradient descent and clustering. Taking a coal-blending optimization problem in the electric power field as the background, the proposed approach is compared with the traditional simulated annealing algorithm and with simulated annealing combined with gradient descent alone. Experiments show that the proposed approach improves the average matching degree by 1.94% and 2.14%, and the best matching degree by 0.08% and 0.09%, over the traditional simulated annealing algorithm and the gradient-descent-augmented variant, respectively. Experiments and analyses also show that the proposed approach greatly reduces time consumption and substantially improves the stability and accuracy of the optimal solution, as well as the efficiency of the solution search.
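The paper's exact formulation is not reproduced in this abstract; the sketch below only illustrates the general idea of hybridizing simulated annealing with gradient descent, using a toy quadratic objective as a stand-in for a coal-blending cost model. The function names, the learning rate, and the cooling schedule are illustrative assumptions, and the clustering step used to partition high-dimensional data is omitted.

```python
import math
import random


def objective(x):
    # Toy separable quadratic with optimum at (1, ..., 1);
    # a stand-in for a real blending cost model.
    return sum((xi - 1.0) ** 2 for xi in x)


def gradient(x):
    # Analytic gradient of the toy objective.
    return [2.0 * (xi - 1.0) for xi in x]


def gd_refine(x, lr=0.1, steps=5):
    # Locally refine a candidate with a few gradient-descent steps.
    for _ in range(steps):
        x = [xi - lr * gi for xi, gi in zip(x, gradient(x))]
    return x


def anneal(dim=10, t0=1.0, t_min=1e-3, alpha=0.95, seed=0):
    rng = random.Random(seed)
    x = [rng.uniform(-5.0, 5.0) for _ in range(dim)]
    best, best_f = list(x), objective(x)
    t = t0
    while t > t_min:
        # Random perturbation scaled by temperature, then local refinement.
        cand = gd_refine([xi + rng.gauss(0.0, t) for xi in x])
        df = objective(cand) - objective(x)
        # Metropolis acceptance: always take improvements, sometimes accept worse.
        if df < 0 or rng.random() < math.exp(-df / t):
            x = cand
            if objective(x) < best_f:
                best, best_f = list(x), objective(x)
        t *= alpha  # geometric cooling
    return best, best_f


best, best_f = anneal()
```

The gradient-descent refinement pulls each perturbed candidate toward a nearby local minimum, so the annealing loop searches over basins rather than raw points, which is one plausible reading of why the hybrid converges faster than plain simulated annealing.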