DSpace Repository / Université Larbi Tébessi – Tébessa

Acceleration of the convergence of the gradient method by using the conjugate gradient


dc.contributor.author BAHI, Amina
dc.date.accessioned 2024-10-21T10:43:45Z
dc.date.available 2024-10-21T10:43:45Z
dc.date.issued 2024-06-09
dc.identifier.uri http://localhost:8080/jspui/handle/123456789/12140
dc.description.abstract The conjugate gradient method is one of the most important techniques for accelerating the gradient algorithm, and several related algorithms have been developed for this purpose. We present a new method that accelerates the convergence of the gradient method (the steepest descent method) using a new variant of the conjugate gradient together with an inexact Wolfe line search. It is shown that this algorithm generates descent directions and converges globally. en_US
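As an illustration of the general technique the abstract describes (not the thesis's new variant), the following sketch implements a standard nonlinear conjugate gradient method with the Fletcher–Reeves coefficient. For simplicity it uses an Armijo backtracking line search rather than the full Wolfe conditions mentioned in the abstract; all function and parameter names are illustrative.

```python
import math

def cg_minimize(f, grad, x0, tol=1e-8, max_iter=500):
    """Nonlinear conjugate gradient (Fletcher-Reeves) with Armijo backtracking.

    Illustrative sketch only: a simplified stand-in for the accelerated
    gradient scheme with inexact (Wolfe-type) line search.
    """
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]          # initial direction: steepest descent
    for _ in range(max_iter):
        gg = sum(gi * gi for gi in g)
        if math.sqrt(gg) < tol:    # gradient small enough: stop
            break
        slope = sum(gi * di for gi, di in zip(g, d))
        if slope >= 0:             # not a descent direction: restart
            d = [-gi for gi in g]
            slope = -gg
        # Armijo backtracking: shrink alpha until sufficient decrease holds
        alpha, c1, fx = 1.0, 1e-4, f(x)
        while f([xi + alpha * di for xi, di in zip(x, d)]) > fx + c1 * alpha * slope:
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        beta = sum(gi * gi for gi in g_new) / gg   # Fletcher-Reeves beta
        d = [-gi + beta * di for gi, di in zip(g_new, d)]
        g = g_new
    return x

# Example: minimize f(x, y) = (x - 1)^2 + 10*(y + 2)^2, minimum at (1, -2)
f = lambda v: (v[0] - 1) ** 2 + 10 * (v[1] + 2) ** 2
grad_f = lambda v: [2 * (v[0] - 1), 20 * (v[1] + 2)]
xmin = cg_minimize(f, grad_f, [0.0, 0.0])
```

The restart to the steepest descent direction when `slope >= 0` is a common safeguard: with inexact line searches, the Fletcher–Reeves update alone does not always guarantee a descent direction.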
dc.language.iso en en_US
dc.publisher University Larbi Tébessi – Tébessa en_US
dc.subject unconstrained problems, gradient method, algorithm, global convergence, line search, inexact line search, conjugate gradient method. en_US
dc.title Acceleration of the convergence of the gradient method by using the conjugate gradient en_US
dc.type Thesis en_US



