Abstract
In this manuscript, we propose an efficient algorithm for solving a class of nonlinear operator equations. The algorithm is an improved version of a previously established method and has the following features: (i) the search direction is bounded and satisfies the sufficient descent condition; (ii) global convergence is achieved when the operator is continuous and satisfies a condition weaker than pseudo-monotonicity. Moreover, the algorithm’s efficiency was demonstrated by comparing it with the previously established method, based on the number of iterations and the time each algorithm required to solve a given problem. The experiments used benchmark test problems that included both monotone and pseudo-monotone problems. Lastly, the algorithm was applied to solving the logistic regression (prediction) model.
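For orientation, the sketch below illustrates a generic derivative-free conjugate-gradient projection scheme of the kind this literature builds on, applied to a nonlinear operator equation F(x) = 0 with a monotone operator. It is not the paper's algorithm: the direction update, line-search constants, and the ridge-regularized logistic-regression operator used as a test case are all illustrative assumptions.

```python
import numpy as np

def cg_projection_solve(F, x0, tol=1e-6, max_iter=1000,
                        rho=0.5, sigma=1e-4, s=1.0):
    """Generic derivative-free CG projection method for monotone F(x) = 0.
    Illustrative sketch only; NOT the algorithm proposed in the paper."""
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    d = -Fx                                    # initial direction
    for k in range(max_iter):
        if np.linalg.norm(Fx) <= tol:
            return x, k
        # Backtracking line search: find t with -F(x + t d)^T d >= sigma * t * ||d||^2
        t = s
        while -F(x + t * d) @ d < sigma * t * np.dot(d, d):
            t *= rho
            if t < 1e-12:
                break
        z = x + t * d                          # trial point on the search ray
        Fz = F(z)
        if np.linalg.norm(Fz) <= tol:
            return z, k
        # Hyperplane projection of x onto {y : F(z)^T (y - z) <= 0}
        x = x - (Fz @ (x - z)) / (Fz @ Fz) * Fz
        Fx_new = F(x)
        # Illustrative Fletcher-Reeves-style conjugate-gradient direction update
        beta = (Fx_new @ Fx_new) / (Fx @ Fx)
        d = -Fx_new + beta * d
        Fx = Fx_new
    return x, max_iter

# Hypothetical usage: ridge-regularized logistic regression on synthetic data,
# where the gradient of the convex loss is a monotone operator.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 5))              # feature matrix (synthetic)
b = rng.integers(0, 2, size=200)               # binary labels (synthetic)
lam = 0.1

def F_logistic(w):
    p = 1.0 / (1.0 + np.exp(-A @ w))           # sigmoid predictions
    return A.T @ (p - b) / len(b) + lam * w    # gradient of regularized loss

w_star, iters = cg_projection_solve(F_logistic, np.zeros(5))
print(f"iterations: {iters}, residual norm: {np.linalg.norm(F_logistic(w_star)):.2e}")
```

The projection step is what allows convergence without derivative information: once F(z) separates the current iterate from the solution set, projecting onto the corresponding hyperplane is guaranteed (for monotone F) not to move away from any solution.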
| Original language | English |
|---|---|
| Article number | 3734 |
| Journal | Mathematics |
| Volume | 12 |
| Issue number | 23 |
| DOIs | |
| Publication status | Published - Dec 2024 |
| Externally published | Yes |
Keywords
- conjugate gradient method
- iterative methods
- logistic regression
- pseudo-monotone nonlinear equations