Optimization Algorithms for High-Dimensional Data: Applications in Scientific Computing
DOI:
https://doi.org/10.63282/3050-9262.IJAIDSML-V2I1P101

Keywords:
High-dimensional optimization, Gradient descent, L-BFGS, Stochastic gradient descent, Adaptive learning rates, Dimensionality reduction, Parallel optimization, Robustness, Computational efficiency, Machine learning

Abstract
High-dimensional data is increasingly prevalent in scientific computing, driven by advances in data collection technologies and the growing complexity of scientific models. Traditional optimization algorithms often struggle with the curse of dimensionality, leading to inefficiency and suboptimal solutions. This paper explores optimization algorithms tailored to high-dimensional data, focusing on their applications in scientific computing. We discuss the challenges posed by high dimensionality, review state-of-the-art optimization techniques, and present case studies from several scientific domains. The paper also includes algorithmic details, performance evaluations, and future research directions.
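To make the contrast concrete, the sketch below compares two of the methods named in the keywords, plain gradient descent and quasi-Newton L-BFGS, on a synthetic high-dimensional least-squares problem. This is an illustrative toy example, not an experiment from the paper; the problem sizes, step size, and iteration count are assumptions chosen only to show the two optimizers side by side.

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic high-dimensional least-squares objective (d = 500 unknowns,
# 200 observations): f(x) = 0.5 * ||A x - b||^2. Sizes are illustrative.
rng = np.random.default_rng(0)
d = 500
A = rng.standard_normal((200, d)) / np.sqrt(d)
b = rng.standard_normal(200)

def f(x):
    r = A @ x - b
    return 0.5 * r @ r

def grad(x):
    return A.T @ (A @ x - b)

# Plain gradient descent with a fixed step size (step chosen small
# enough for this problem's curvature; a real solver would use a
# line search or an adaptive learning rate).
x_gd = np.zeros(d)
for _ in range(1000):
    x_gd -= 0.5 * grad(x_gd)

# L-BFGS via SciPy: approximates curvature from a limited history of
# gradients, so memory stays O(d) even in high dimensions.
res = minimize(f, np.zeros(d), jac=grad, method="L-BFGS-B")

print(f"gradient descent: f = {f(x_gd):.6f} after 1000 iterations")
print(f"L-BFGS:           f = {res.fun:.6f} after {res.nit} iterations")
```

In runs like this, L-BFGS typically reaches a comparable objective value in far fewer iterations than fixed-step gradient descent, which is why limited-memory quasi-Newton methods are a standard choice for smooth high-dimensional problems.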