New Douglas-Rachford Splitting Algorithms for Generalized DC Programming with Applications in Machine Learning

  • Yonghong Yao
  • Lateef O. Jolaoso
  • Yekini Shehu*
  • Jen-Chih Yao

  *Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

In this work, we propose new Douglas-Rachford splitting algorithms for solving a class of generalized DC (difference of convex functions) programs in real Hilbert spaces. The proposed methods leverage the proximal properties of the nonsmooth component together with a control parameter that accelerates convergence. We prove convergence of these methods to critical points of the nonconvex problem under reasonable conditions. We evaluate the performance and effectiveness of our methods on three practical examples from machine learning. Our findings demonstrate that the proposed methods solve these problems efficiently and outperform state-of-the-art techniques such as the DCA (DC Algorithm) and ADMM.
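For readers unfamiliar with the splitting scheme underlying the paper, the sketch below shows the classical Douglas-Rachford iteration on a convex toy problem, min_x |x| + 0.5(x - b)^2, where both proximal operators have closed forms. This is only an illustration of the base iteration, not the authors' generalized DC variant; the choice of b, step size t, and iteration count are assumptions for the example.

```python
# Minimal Douglas-Rachford splitting sketch for min_x f(x) + g(x),
# with f(x) = |x| (soft-threshold prox) and g(x) = 0.5 * (x - b)**2.
# Illustrative only -- not the paper's generalized DC algorithm.

def soft_threshold(v, t):
    """prox_{t|.|}(v): shrink v toward zero by t."""
    if v > t:
        return v - t
    if v < -t:
        return v + t
    return 0.0

def douglas_rachford(b, t=1.0, iters=100):
    z = 0.0
    for _ in range(iters):
        x = soft_threshold(z, t)               # x = prox_{tf}(z)
        y = (2.0 * x - z + t * b) / (1.0 + t)  # y = prox_{tg}(2x - z)
        z = z + y - x                          # reflected averaging update
    return soft_threshold(z, t)

# For b = 3, the minimizer of |x| + 0.5*(x - 3)^2 is the soft-threshold
# of b at level 1, i.e. x* = 2.
print(douglas_rachford(3.0))  # -> 2.0 (up to numerical tolerance)
```

DC programs split the objective as a difference g - h of convex functions rather than a sum, which is why the paper's generalized scheme and its convergence analysis differ from this convex prototype.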

Original language: English
Article number: 88
Journal: Journal of Scientific Computing
Volume: 103
Issue number: 3
DOIs
Publication status: Published - Jun 2025

Keywords

  • DC programming
  • Douglas-Rachford splitting algorithm
  • Machine learning
  • Nonconvex optimization

