
Optimization Problem: comparing the steepest descent and gradient descent methods


Question

Optimization problem comparing the steepest descent and gradient descent methods, using the data points below and the polynomial equation, solving for the coefficients "a". I mostly need help with the MATLAB code, since there are many ways to do it:

(Computer problem) Implement a MATLAB routine for the steepest descent algorithm for a quadratic problem. Given the following data set:

x: -4  -3  -2  -1  0  1  2    3  4   5   6
y: 56  35  21  11  3  1  0.5  6  13  28  48

apply the MATLAB routine to compute a degree-two polynomial f(x) = a2*x^2 + a1*x + a0 that fits the above data set. Compare this code with the fixed step size method.

Explanation / Answer
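
The data fitting task in the question is itself a quadratic problem: with f(x) = a2*x^2 + a1*x + a0, minimizing the sum of squared residuals over the given points means minimizing g(a) = 0.5*a'*Q*a - b'*a (up to a constant), where A has rows [x_i^2, x_i, 1], Q = A'*A and b = A'*y. For a quadratic, steepest descent can use the exact line-search step t_k = (r'*r)/(r'*Q*r) with r = b - Q*a, while the fixed step size method keeps t constant (any 0 < t < 2/max(eig(Q)) converges). A minimal sketch along those lines, using the data from the question (variable names, tolerance, and step choices are my own):

% Steepest descent for the degree-two polynomial fit (sketch)
x = (-4:6)';
y = [56 35 21 11 3 1 0.5 6 13 28 48]';
A = [x.^2 x ones(size(x))];          % design matrix: columns for a2, a1, a0
Q = A'*A;  b = A'*y;                 % gradient of the least-squares objective is Q*a - b
a  = zeros(3,1);                     % iterate for exact line search: [a2; a1; a0]
af = zeros(3,1);                     % iterate for the fixed step size method
tol = 1e-6;  N = 50000;
tf = 1/max(eig(Q));                  % safe constant step; any 0 < t < 2/max(eig(Q)) converges

for k = 1:N                          % steepest descent with exact line search
    r = b - Q*a;                     % negative gradient (residual of the normal equations)
    if norm(r) < tol, break; end
    t = (r'*r)/(r'*Q*r);             % optimal step along -gradient for a quadratic
    a = a + t*r;
end
for kf = 1:N                         % fixed step size method on the same problem
    r = b - Q*af;
    if norm(r) < tol, break; end
    af = af + tf*r;
end

fprintf('Exact line search: a2=%.4f a1=%.4f a0=%.4f (%d iterations)\n', a(1), a(2), a(3), k);
fprintf('Fixed step t=%.2e: a2=%.4f a1=%.4f a0=%.4f (%d iterations)\n', tf, af(1), af(2), af(3), kf);

The constant step size code below works the same way for a general (non-quadratic) test function; the only difference is that t is fixed ahead of time instead of being computed from Q at each iteration.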

Code for the constant step size steepest descent method
%Functions used by the algorithm (save each in its own file, f1.m, gradient_f1.m, check_min_t_f1.m, or place them at the end of the script file in newer MATLAB releases):
function r = f1(xk_yk)
% Objective: f1(x1,x2) = (x1^2 - x2)^2 + (x1 - 1)^2
r = (xk_yk(1)^2 - xk_yk(2))^2 + (xk_yk(1) - 1)^2;
end

function p = gradient_f1(xk_yk)
% Gradient of f1, returned as a 1x2 row vector
p(1) = 4*(xk_yk(1)^2 - xk_yk(2))*xk_yk(1) + 2*(xk_yk(1) - 1);
p(2) = -2*(xk_yk(1)^2 - xk_yk(2));
end

function idx = check_min_t_f1(t1,t2,t3,xk_yk)
% Index of the trial step (t1, t2 or t3) that gives the smallest value of f1
[~,idx] = min([f1(xk_yk - t1.*gradient_f1(xk_yk)) ...
               f1(xk_yk - t2.*gradient_f1(xk_yk)) ...
               f1(xk_yk - t3.*gradient_f1(xk_yk))]);
end
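
The helper check_min_t_f1 is not called in the script below; it just returns the index of whichever of three trial steps gives the smallest f1 value. If you wanted a crude adaptive step instead of the constant t, one way to use it inside the descent loop (the trial values here are assumptions of mine) would be:

% Sketch: pick the best of three assumed trial steps at each iteration
% trial = [0.05 0.15 0.45];
% idx = check_min_t_f1(trial(1), trial(2), trial(3), xk_yk(i-1,:));
% t = trial(idx);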

clear; clf; clc;
%Gradient method for f1 with a constant step size
format long g;
% initial point
x0_y0 = [2 2];
% termination tolerance and maximum number of iterations
e = 0.05; N = 1000;
% constant step size
t = 0.15;
% first iterate
xk_yk(1,:) = x0_y0 - t.*gradient_f1(x0_y0);
disp('iteration      x1             x2             f1(x1,x2)      Norm(gradient)');
for i = 2:N
    xk_yk(i,:) = xk_yk(i-1,:) - t.*gradient_f1(xk_yk(i-1,:));
    if f1(xk_yk(i,:)) > f1(xk_yk(i-1,:))
        disp('not a descent direction ... exiting');
        fprintf('The extremum could not be located after %d iterations\n', i);
        break;
    end
    fprintf('%-4d %3.4f %3.4f %3.4f %3.4f\n', i, xk_yk(i,1), xk_yk(i,2), ...
        f1(xk_yk(i,:)), norm(gradient_f1(xk_yk(i,:))));
    if norm(gradient_f1(xk_yk(i,:))) < e
        disp('----------------------------------------------------------------------');
        fprintf('The minimum of f1 was found after %d iterations,\n', i);
        fprintf('using a constant step of %f; the gradient method gives x1=%f x2=%f,\n', t, xk_yk(i,1), xk_yk(i,2));
        fprintf('and the minimum value of the function is %f\n', f1(xk_yk(i,:)));
        disp('----------------------------------------------------------------------');
        break;
    end
end

n = 30; % number of contour levels
format long g
[X,Y] = meshgrid(-1:.02:2, -1.4:.02:4);
Z = (X.^2 - Y).^2 + (X - ones(size(X))).^2;
% Generate a contour plot of Z and overlay the iterates
[C,h] = contour(X,Y,Z,n);
clabel(C,h);
xlabel('x_1','Fontsize',18); ylabel('x_2','Fontsize',18);
title(sprintf('f_1(x_1,x_2), Const Stepsize=%1.3f',t),'Fontsize',18);
grid on
hold on;
convergence = [x0_y0' xk_yk'];
%scatter(convergence(1,:),convergence(2,:));
plot(convergence(1,:),convergence(2,:),'-ro');
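
To check the polynomial fit visually, a short snippet (assuming the coefficient vector a and the data x, y from the least-squares sketch above) would be:

% Plot the data and the fitted degree-two polynomial
figure; plot(x, y, 'bo'); hold on;
xx = linspace(-4, 6, 200);
plot(xx, a(1)*xx.^2 + a(2)*xx + a(3), 'r-');
xlabel('x'); ylabel('y'); legend('data','degree-2 fit');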
