Please use this identifier to cite or link to this item: http://localhost:8080/jspui/handle/123456789/12140
Full metadata record
DC Field                 Value                                                Language
dc.contributor.author    BAHI, Amina                                          -
dc.date.accessioned      2024-10-21T10:43:45Z                                 -
dc.date.available        2024-10-21T10:43:45Z                                 -
dc.date.issued           2024-06-09                                           -
dc.identifier.uri        http://localhost:8080/jspui/handle/123456789/12140   -
dc.description.abstract  The conjugate gradient method is one of the most important techniques for accelerating the gradient algorithm, and several related algorithms have been developed for this purpose. We present a new method that accelerates the convergence of the gradient (steepest descent) method, using a new conjugate gradient variant together with a strong inexact Wolfe line search. We show that the algorithm generates descent directions and converges globally.   en_US
dc.language.iso          en                                                   en_US
dc.publisher             University Larbi Tébessi – Tébessa                   en_US
dc.subject               unconstrained problems, gradient method, algorithm, global convergence, line search, inexact line search, conjugate gradient method   en_US
dc.title                 Acceleration of the convergence of the gradient method by using the conjugate gradient   en_US
dc.type                  Thesis                                               en_US
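The abstract above describes accelerating the steepest descent method with a conjugate gradient direction update and an inexact line search. As a rough illustration of the general technique only (not the thesis's new variant), a minimal Fletcher–Reeves conjugate gradient with a simple Armijo backtracking line search might look like this; the strong Wolfe search used in the thesis adds a curvature condition omitted here, and all function names below are hypothetical:

```python
# Illustrative sketch only: generic Fletcher–Reeves nonlinear conjugate
# gradient with Armijo backtracking. The thesis's own conjugate gradient
# variant and its strong Wolfe line search are not reproduced here.
import numpy as np

def armijo_backtracking(f, x, d, fx, slope, c1=1e-4, alpha=1.0, max_halvings=50):
    """Shrink the step until the Armijo sufficient-decrease condition holds."""
    for _ in range(max_halvings):
        if f(x + alpha * d) <= fx + c1 * alpha * slope:
            return alpha
        alpha *= 0.5
    return alpha

def cg_fletcher_reeves(f, grad, x0, tol=1e-8, max_iter=200):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                      # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        slope = g @ d
        if slope >= 0:          # safeguard: restart if d is not a descent direction
            d, slope = -g, -(g @ g)
        alpha = armijo_backtracking(f, x, d, f(x), slope)
        x = x + alpha * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher–Reeves coefficient
        d = -g_new + beta * d              # conjugate direction update
        g = g_new
    return x

# Usage on a convex quadratic whose minimizer is (1, -2):
f = lambda x: (x[0] - 1) ** 2 + 10 * (x[1] + 2) ** 2
grad = lambda x: np.array([2 * (x[0] - 1), 20 * (x[1] + 2)])
x_star = cg_fletcher_reeves(f, grad, np.zeros(2))
```

The restart safeguard keeps every search direction a descent direction, which is the property the abstract claims the new algorithm guarantees by construction.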
Appears in Collections: 2- Mathematics (رياضيات)

Files in This Item:
File                                                                                         Description   Size      Format
Acceleration of the convergence of the gradient method by using the conjugate gradient.pdf                 2,86 MB   Adobe PDF   View/Open


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
