Two Bregman projection methods for solving variational inequality problems in Hilbert spaces with applications to signal processing

Lateef Olakunle Jolaoso*, Maggie Aphane, Safeer Hussain Khan

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Studying Bregman distance iterative methods for solving optimization problems has become an important and interesting topic because of the numerous applications of Bregman distance techniques. These applications depend on the type of convex function associated with the Bregman distance. In this paper, two extragradient methods are proposed for solving pseudomonotone variational inequality problems using the Bregman distance in real Hilbert spaces. The first algorithm uses a fixed stepsize that depends on a prior estimate of the Lipschitz constant of the cost operator. The second algorithm uses a self-adaptive stepsize that requires no prior estimate of the Lipschitz constant. Convergence results are proved for approximating solutions of the pseudomonotone variational inequality problem under standard assumptions. Moreover, numerical experiments illustrate the performance of the proposed algorithms with different convex functions, such as the Shannon entropy and the Burg entropy. In addition, an application of the results to a signal processing problem is presented.
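To illustrate the kind of iteration the abstract describes, the following is a minimal Python sketch of a Bregman (entropic mirror) extragradient step on the probability simplex, using the Shannon-entropy Bregman distance and a self-adaptive stepsize of the form λ_{k+1} = min(λ_k, μ‖x_k − y_k‖ / ‖F(x_k) − F(y_k)‖). This is not the authors' exact scheme: the stepsize rule, the choice of feasible set, and the stand-in operator F(x) = x − b are all assumptions made for the example.

```python
import numpy as np

def bregman_extragradient(F, x0, lam0=0.5, mu=0.4, iters=500):
    """Entropic-mirror extragradient sketch on the probability simplex.

    With the Shannon-entropy Bregman distance, each Bregman projection
    step becomes a multiplicative update followed by renormalisation.
    The stepsize is self-adaptive, so no Lipschitz constant of F is
    needed in advance (a hypothetical rule, not the paper's exact one).
    """
    x, lam = np.asarray(x0, dtype=float), lam0
    for _ in range(iters):
        Fx = F(x)
        y = x * np.exp(-lam * Fx)      # Bregman (mirror) prediction step
        y /= y.sum()                   # renormalise back onto the simplex
        Fy = F(y)
        x_new = x * np.exp(-lam * Fy)  # extragradient correction step
        x_new /= x_new.sum()
        # self-adaptive stepsize: shrink only when the operator varies fast
        denom = np.linalg.norm(Fx - Fy)
        if denom > 1e-12:
            lam = min(lam, mu * np.linalg.norm(x - y) / denom)
        x = x_new
    return x

# Stand-in test problem: F(x) = x - b is strongly monotone, and the
# variational inequality on the simplex is solved by x* = b whenever
# b itself lies in the simplex.
b = np.array([0.7, 0.3])
sol = bregman_extragradient(lambda x: x - b, x0=[0.5, 0.5])
```

Because every iterate stays strictly positive and is renormalised, the method never needs an explicit Euclidean projection onto the simplex; this closed-form update is the usual practical motivation for pairing the Shannon entropy with simplex constraints.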

Original language: English
Article number: 2007
Pages (from-to): 1-20
Number of pages: 20
Journal: Symmetry
Volume: 12
Issue number: 12
DOIs
Publication status: Published - Dec 2020

Keywords

  • Bregman divergence
  • Popov extragradient
  • Strong convergence
  • Subgradient
  • Variational inequalities
  • Weak convergence
