Academic Integrity: tutoring, explanations, and feedback — we don’t complete graded work or submit on a student’s behalf.


ID: 3605563 • Letter: W

Question

Write a program to implement steepest descent using line search. Terminate the line search when the length of the interval is less than 10^-6. Apply the program to Rosenbrock's function

f(x) = 100(x2 - x1^2)^2 + (1 - x1)^2

Analytically determine the minimizer for Rosenbrock's function and show that it satisfies the FONC and SOSC. Consider the rate at which f(xk) approaches the minimum value, and determine whether this is consistent with the convergence theory for steepest descent.

Do all of the above for the quadratic.

Use the theory to explain any differences in the performance of the algorithm on these two problems.
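One way to implement this is a steepest-descent loop with a golden-section line search that stops once the bracketing interval is shorter than 10^-6. The sketch below is a minimal Python version; since the original program is not shown, the initial search interval [0, 1] and the iteration cap are assumptions:

```python
import numpy as np

def rosenbrock(x):
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def rosenbrock_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0]**2),
    ])

def golden_section(phi, a, b, tol=1e-6):
    """Shrink [a, b] by the golden ratio until b - a < tol; return the midpoint."""
    rho = (np.sqrt(5.0) - 1.0) / 2.0   # ~0.618
    c = b - rho * (b - a)
    d = a + rho * (b - a)
    fc, fd = phi(c), phi(d)
    while b - a > tol:
        if fc < fd:                    # minimum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - rho * (b - a)
            fc = phi(c)
        else:                          # minimum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + rho * (b - a)
            fd = phi(d)
    return 0.5 * (a + b)

def steepest_descent(f, grad, x0, max_iter=5000, gtol=1e-8):
    x = np.asarray(x0, dtype=float)
    k = 0
    for k in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < gtol:
            break
        # line search along the negative gradient over the assumed interval [0, 1]
        alpha = golden_section(lambda a: f(x - a * g), 0.0, 1.0)
        x = x - alpha * g
    return x, k
```

Running `steepest_descent(rosenbrock, rosenbrock_grad, [1.2, 1.2])` reproduces the slow, small-step behavior reported below.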

Explanation / Answer

Set the initial step length alpha0 = 1 and print the step length used by each method at each iteration. First try the initial point x0 = (1.2, 1.2)T and then the more difficult starting point x0 = (-1.2, 1)T.

For steepest descent with x0 = (1.2, 1.2)T, convergence is very slow.
At 5000 iterations,
f(x5000) = 8.054 × 10^-6, x5000 = [1.0028, 1.0055]T
The step length remains around 2 × 10^-3.
For steepest descent with x0 = (-1.2, 1)T, convergence is much slower. At 5000 iterations,
f(x5000) = 1.7153 × 10^-3, x5000 = [1.0414, 1.0844]T
The step length also remains around 2 × 10^-3.
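For the quadratic part, the theory predicts very different behavior: steepest descent with exact line search on f(x) = ½ xᵀQx reduces f by a factor of at most ((κ-1)/(κ+1))² per iteration, where κ is the condition number of Q. The problem statement does not specify which quadratic, so Q = diag(1, 100) below is purely a hypothetical example; the contrast with Rosenbrock's curved valley is the point:

```python
import numpy as np

Q = np.diag([1.0, 100.0])          # hypothetical quadratic; condition number kappa = 100
f = lambda x: 0.5 * x @ Q @ x
grad = lambda x: Q @ x

x = np.array([1.2, 1.2])
for k in range(2000):
    g = grad(x)
    if np.linalg.norm(g) < 1e-10:
        break
    # for a quadratic, the exact line-search step has a closed form
    alpha = (g @ g) / (g @ Q @ g)
    x = x - alpha * g

# theoretical per-step bound on f(x_{k+1}) / f(x_k) for exact line search
kappa = 100.0
bound = ((kappa - 1.0) / (kappa + 1.0)) ** 2
```

Even with κ = 100 the quadratic converges in a predictable linear fashion, whereas on Rosenbrock's function the steep, narrow valley forces tiny steps and the ~2 × 10^-3 step lengths seen above.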
For Newton's method with x0 = (1.2, 1.2)T, the convergence history is as follows.
f(x) ||Grad f(x)|| alpha iter
5.800e+00 1.000e+00 1.00e+00 0
3.838e-02 1.252e+02 1.00e+00 1
3.156e-02 3.998e-01 6.56e-01 2
2.708e-03 7.867e+00 1.00e+00 3
6.063e-04 2.153e-01 1.00e+00 4
6.658e-07 1.102e+00 1.00e+00 5
4.433e-11 2.795e-03 1.00e+00 6
3.598e-21 2.979e-04 1.00e+00 7
0.000e+00 2.056e-10 1.00e+00 8
The minimizer is:
(1.000e+00, 1.000e+00)
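The Newton code that produced this table is not shown. A minimal sketch consistent with it pairs the Newton step with a backtracking (Armijo) line search, which would explain the step lengths below 1 in the early iterations of the harder run; the Armijo constant 1e-4 and the halving factor are assumptions:

```python
import numpy as np

def rosenbrock(x):
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def rosenbrock_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0]**2),
    ])

def rosenbrock_hess(x):
    return np.array([
        [1200.0 * x[0]**2 - 400.0 * x[1] + 2.0, -400.0 * x[0]],
        [-400.0 * x[0], 200.0],
    ])

def newton(f, grad, hess, x0, max_iter=50, gtol=1e-10):
    x = np.asarray(x0, dtype=float)
    k = 0
    for k in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < gtol:
            break
        p = np.linalg.solve(hess(x), -g)   # Newton direction
        alpha = 1.0
        for _ in range(60):                # backtracking (Armijo) line search
            if f(x + alpha * p) <= f(x) + 1e-4 * alpha * g.dot(p):
                break
            alpha *= 0.5
        x = x + alpha * p
    return x, k
```

From x0 = (1.2, 1.2)T this converges to (1, 1) in a handful of iterations, matching the quadratic terminal convergence visible in the table.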

For Newton's method with x0 = (-1.2, 1)T, the convergence history is as follows.
f(x) ||Grad f(x)|| alpha iter
2.420e+01 1.000e+00 1.00e+00 0
4.732e+00 2.329e+02 1.00e+00 1
4.533e+00 4.639e+00 1.67e-01 2
3.130e+00 4.473e+01 1.00e+00 3
2.920e+00 5.885e+00 3.87e-01 4
1.969e+00 2.572e+01 1.00e+00 5
1.949e+00 4.139e+00 4.78e-01 6
1.116e+00 1.782e+01 1.00e+00 7
1.082e+00 2.341e+00 4.30e-01 8
5.585e-01 1.370e+01 1.00e+00 9
5.213e-01 1.333e+00 4.78e-01 10
2.309e-01 1.281e+01 1.00e+00 11
2.260e-01 6.652e-01 5.90e-01 12
6.480e-02 1.270e+01 1.00e+00 13
6.432e-02 2.837e-01 7.29e-01 14
7.202e-03 9.338e+00 1.00e+00 15
4.019e-03 8.183e-02 1.00e+00 16
8.828e-06 2.803e+00 1.00e+00 17
7.765e-09 3.897e-03 1.00e+00 18
3.329e-17 3.932e-03 1.00e+00 19
8.090e-28 7.557e-09 1.00e+00 20
The minimizer is:
(1.000e+00, 1.000e+00)
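The question also asks to show that the minimizer satisfies the FONC and SOSC. At x* = (1, 1) the gradient is exactly zero (FONC), and the Hessian [[802, -400], [-400, 200]] has determinant 400 > 0 and positive trace, so it is positive definite (SOSC). A quick numerical check of both conditions:

```python
import numpy as np

x_star = np.array([1.0, 1.0])

# gradient of f(x) = 100(x2 - x1^2)^2 + (1 - x1)^2 at x*
grad = np.array([
    -400.0 * x_star[0] * (x_star[1] - x_star[0]**2) - 2.0 * (1.0 - x_star[0]),
    200.0 * (x_star[1] - x_star[0]**2),
])

# Hessian of f at x*
hess = np.array([
    [1200.0 * x_star[0]**2 - 400.0 * x_star[1] + 2.0, -400.0 * x_star[0]],
    [-400.0 * x_star[0], 200.0],
])

print(grad)                      # FONC: gradient is [0, 0]
print(np.linalg.eigvalsh(hess))  # SOSC: both eigenvalues positive
```

The smaller Hessian eigenvalue is about 0.4 and the larger about 1001.6; this extreme conditioning near the minimizer is exactly what the convergence theory says will cripple steepest descent while leaving Newton's method unaffected.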
