IMPROVED SUBGRADIENT EXTRAGRADIENT METHODS WITH SELF-ADAPTIVE STEPSIZES FOR VARIATIONAL INEQUALITIES IN HILBERT SPACES

L. O. Jolaoso, X. Qin, Y. Shehu*, J. C. Yao

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)

Abstract

In this paper, we propose improved subgradient extragradient algorithms, in terms of step sizes, to solve pseudo-monotone variational inequalities in real Hilbert spaces. The first algorithm requires a Lipschitz condition on the cost operator, but its improved step-size procedure is given without any prior estimate of the Lipschitz constant. The second algorithm, with new and improved step sizes, does not require a Lipschitz condition on the cost operator, and its convergence is proved without imposing additional summability conditions on the sequence defining the step sizes. Weak convergence and norm convergence with a Q-linear rate are proved for the proposed methods under mild conditions. Finally, we study the numerical behaviour of the proposed algorithms and compare them with some known methods in the literature.
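The scheme described in the abstract builds on the subgradient extragradient template: one projection onto the feasible set C, followed by a cheap projection onto a constructed half-space, with a step size that adapts to locally observed operator behaviour instead of a known Lipschitz constant. The following is a minimal sketch of that template, not the paper's exact algorithms; the test operator, the ball constraint, and the parameter choices (initial step size, factor mu) are illustrative assumptions.

```python
import numpy as np

def proj_ball(x, r=1.0):
    # Euclidean projection onto the closed ball of radius r centred at 0.
    nx = np.linalg.norm(x)
    return x if nx <= r else (r / nx) * x

def subgrad_extragrad(F, proj_C, x0, lam0=1.0, mu=0.5, iters=200):
    """Subgradient extragradient iteration with a self-adaptive step size.

    The update lam <- min(lam, mu * ||x - y|| / ||F(x) - F(y)||) needs no
    prior estimate of the Lipschitz constant of F; this is a generic
    adaptive rule of the kind the abstract refers to, not the paper's
    exact procedure.
    """
    x, lam = np.asarray(x0, dtype=float), lam0
    for _ in range(iters):
        Fx = F(x)
        y = proj_C(x - lam * Fx)          # one projection onto C
        Fy = F(y)
        # Project x - lam*F(y) onto the half-space
        # T = {w : <a, w - y> <= 0}, which contains C and has a
        # closed-form projection (the "subgradient" step).
        a = x - lam * Fx - y
        z = x - lam * Fy
        aa = a @ a
        x_new = z - (max(a @ (z - y), 0.0) / aa) * a if aa > 0 else z
        # Self-adaptive step size: only ever shrinks, and only when needed.
        dF = np.linalg.norm(Fx - Fy)
        if dF > 0:
            lam = min(lam, mu * np.linalg.norm(x - y) / dF)
        x = x_new
    return x

# Illustrative example: F(x) = M x with M skew-symmetric (monotone, hence
# pseudo-monotone); the solution of VI(F, unit ball) is the origin.
M = np.array([[0.0, -1.0], [1.0, 0.0]])
sol = subgrad_extragrad(lambda v: M @ v, proj_ball, np.array([0.9, -0.4]))
```

For this rotation-type operator the iterates contract toward the origin once the step size has adapted, which makes it a convenient smoke test: the plain gradient-projection method fails to converge on exactly this example.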

Original language: English
Pages (from-to): 1591-1614
Number of pages: 24
Journal: Journal of Nonlinear and Convex Analysis
Volume: 22
Issue number: 8
Publication status: Published - 2021

Keywords

  • Pseudo-monotone variational inequalities
  • Subgradient extragradient algorithms
  • Linear convergence
  • Weak convergence

