## Abstract

Iterative methods based on the Bregman distance have become an important and interesting topic in optimization because of the many applications of Bregman distance techniques, which depend on the type of convex function associated with the Bregman distance. In this paper, two extragradient methods are proposed for solving pseudomonotone variational inequality problems using a Bregman distance in real Hilbert spaces. The first algorithm uses a fixed stepsize that depends on a prior estimate of the Lipschitz constant of the cost operator. The second algorithm uses a self-adaptive stepsize that does not require such an estimate. Convergence results are proved for approximating solutions of the pseudomonotone variational inequality problem under standard assumptions. Moreover, numerical experiments illustrate the performance of the proposed algorithms with different convex functions, such as the Shannon entropy and the Burg entropy. In addition, an application of the results to a signal processing problem is presented.
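To give a flavour of the fixed-stepsize scheme, the sketch below implements a generic Bregman extragradient iteration with the Shannon-entropy Bregman distance (KL divergence) on the probability simplex, where the Bregman projection has a closed form. This is a minimal illustration under our own assumptions (a toy skew-symmetric cost operator, ergodic averaging of the iterates), not the paper's exact algorithm or problem setting:

```python
import numpy as np

def kl_mirror_step(x, g, lam):
    # Entropic Bregman step on the simplex:
    #   argmin_z  lam*<g, z> + D_KL(z, x)
    # has the closed form z_i proportional to x_i * exp(-lam * g_i).
    z = x * np.exp(-lam * g)
    return z / z.sum()

def bregman_extragradient(F, x0, lam, iters):
    # Fixed-stepsize extragradient with the Shannon-entropy Bregman
    # distance; lam is chosen from a (here assumed known) Lipschitz
    # bound of F. Returns the ergodic average of the correction steps.
    x, avg = x0, np.zeros_like(x0)
    for _ in range(iters):
        y = kl_mirror_step(x, F(x), lam)   # prediction step
        x = kl_mirror_step(x, F(y), lam)   # correction step
        avg += x
    return avg / iters

# Toy pseudomonotone VI on the simplex: F(x) = A x with a
# skew-symmetric A (rock-paper-scissors payoffs), whose unique
# solution is the uniform distribution.
A = np.array([[0., 1., -1.],
              [-1., 0., 1.],
              [1., -1., 0.]])
x_bar = bregman_extragradient(lambda x: A @ x,
                              np.array([0.6, 0.3, 0.1]),
                              lam=0.3, iters=3000)
```

The averaged iterate `x_bar` approaches the uniform distribution, the unique solution of this toy variational inequality; the self-adaptive variant described in the paper would replace the fixed `lam` with a stepsize updated from observed operator values.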

Original language | English
---|---
Article number | 2007
Pages (from-to) | 1-20
Number of pages | 20
Journal | Symmetry
Volume | 12
Issue number | 12
DOIs |
Publication status | Published - Dec 2020

## Keywords

- Bregman divergence
- Popov extragradient
- Strong convergence
- Subgradient
- Variational inequalities
- Weak convergence