Please use this identifier to cite or link to this item:
http://localhost:8080/jspui/handle/123456789/12140
Title: | Acceleration of the convergence of the gradient method by using the conjugate gradient |
Authors: | BAHI, Amina |
Keywords: | unconstrained problems, gradient method, algorithm, global convergence, line search, inexact line search, conjugate gradient method. |
Issue Date: | 9-Jun-2024 |
Publisher: | University Larbi Tébessi – Tébessa |
Abstract: | The conjugate gradient method is one of the most important techniques used to accelerate the gradient algorithm, and several related algorithms have been developed for this purpose. We present a new method that accelerates the convergence of the gradient method (steepest descent) using a new version of the conjugate gradient together with an inexact Wolfe line search. It is shown that this algorithm generates descent directions and converges globally. |
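The abstract describes a nonlinear conjugate gradient scheme paired with an inexact Wolfe line search. The thesis's specific update formula is not given here, so the following is only a minimal illustrative sketch of the general technique it builds on: nonlinear CG with the standard PR+ (Polak-Ribière, clipped at zero) beta and a bisection-style weak Wolfe line search. All function and parameter names are my own for illustration, not the author's.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, alpha=1.0, max_iter=50):
    """Bisection search for a step length satisfying the weak Wolfe conditions."""
    fx, slope = f(x), grad(x) @ d          # slope < 0 for a descent direction d
    lo, hi = 0.0, np.inf
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * slope:    # Armijo fails: step too long
            hi = alpha
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < c2 * slope:        # curvature fails: step too short
            lo = alpha
            alpha = 2.0 * lo if hi == np.inf else 0.5 * (lo + hi)
        else:
            return alpha
    return alpha

def conjugate_gradient(f, grad, x0, tol=1e-8, max_iter=200):
    """Nonlinear CG with PR+ beta; restarts to steepest descent when needed."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))    # PR+ formula
        d = -g_new + beta * d
        if g_new @ d >= 0:        # safeguard: not a descent direction, restart
            d = -g_new
        x, g = x_new, g_new
    return x

# Example: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer is the solution of A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = conjugate_gradient(f, grad, np.zeros(2))
```

The restart-to-steepest-descent safeguard is one common way such methods guarantee descent directions under an inexact line search, which is the property the abstract emphasizes; the thesis's own update rule may achieve this differently.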
URI: | http://localhost:8080/jspui/handle/123456789/12140
Appears in Collections: | 2- Mathematics (رياضيات)
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
Acceleration of the convergence of the gradient method by using the conjugate gradient.pdf | | 2.86 MB | Adobe PDF | View/Open |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.