Academic Integrity: tutoring, explanations, and feedback — we don’t complete graded work or submit on a student’s behalf.

Solve the eigenvalue problem Ax = λx for the following 3×3 matrix.

ID: 1893325 • Letter: S

Question

Solve the eigenvalue problem Ax = λx for the following 3×3 matrix. The numerical entries a_ij of the 3×3 matrix A are those given in the accompanying table below.

Find the eigenvalues, that is, the roots of det(A − λI) = 0. Show your working and explain your reasoning. NB: to help you find the eigenvalues, you might plot the characteristic polynomial as a function of its parameter λ to locate the likely values of one or more of the roots, e.g. using MATLAB.

Find the corresponding eigenvector for each eigenvalue. Show your working and explain your reasoning, and check that all your solutions are eigenvectors of the appropriate matrix.
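The question's table of entries a_ij is not reproduced here, so the following is an illustration only. It sketches the suggested root-locating approach (the question mentions MATLAB; this is an equivalent sketch in Python) for a hypothetical symmetric matrix A = [[2, 1, 0], [1, 2, 1], [0, 1, 2]], scanning det(A − λI) for sign changes and refining each bracketed root by bisection:

```python
# Sketch of locating eigenvalues by scanning the characteristic
# polynomial p(lam) = det(A - lam*I) for sign changes.
# NOTE: the matrix A below is a hypothetical stand-in, since the
# question's actual table of entries is not reproduced here.

A = [[2.0, 1.0, 0.0],
     [1.0, 2.0, 1.0],
     [0.0, 1.0, 2.0]]

def char_poly(A, lam):
    """det(A - lam*I) for a 3x3 matrix, by cofactor expansion."""
    m = [[A[i][j] - (lam if i == j else 0.0) for j in range(3)]
         for i in range(3)]
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def bracket_roots(A, lo=-10.0, hi=10.0, steps=2000):
    """Scan [lo, hi] and return intervals where p(lam) changes sign."""
    brackets = []
    h = (hi - lo) / steps
    prev = char_poly(A, lo)
    for k in range(1, steps + 1):
        lam = lo + k * h
        cur = char_poly(A, lam)
        if prev == 0.0 or prev * cur < 0.0:
            brackets.append((lam - h, lam))
        prev = cur
    return brackets

def bisect_root(A, lo, hi, tol=1e-12):
    """Refine one bracketed root of the characteristic polynomial."""
    flo = char_poly(A, lo)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        fmid = char_poly(A, mid)
        if flo * fmid <= 0.0:
            hi = mid
        else:
            lo, flo = mid, fmid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

eigenvalues = [bisect_root(A, a, b) for a, b in bracket_roots(A)]
print(eigenvalues)  # for this A the exact roots are 2 - sqrt(2), 2, 2 + sqrt(2)
```

For this particular A the characteristic polynomial factors as (2 − λ)[(2 − λ)² − 2], so the scan should bracket three roots; in MATLAB the analogous step would be plotting det(A − λI) over a range of λ and reading off the zero crossings.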

Explanation / Answer

An eigenvector of a square matrix is a non-zero vector that, when multiplied by the matrix, yields a vector parallel to the original. For example, if three-element vectors are pictured as arrows in three-dimensional space, an eigenvector of a 3×3 matrix A is an arrow whose direction is either preserved or exactly reversed after multiplication by A. The corresponding eigenvalue determines how the length and sense of the arrow are changed by the operation.

Specifically, a non-zero column vector v is a right eigenvector of a matrix A if (and only if) there exists a number λ such that Av = λv. If the vector instead satisfies vA = λv, it is said to be a left eigenvector. The number λ is called the eigenvalue corresponding to that vector. An eigenspace of A is the set of all eigenvectors with the same eigenvalue, together with the zero vector. The terms characteristic vector, characteristic value, and characteristic space are also used for these concepts. The prefix eigen- is adopted from the German word eigen, meaning "self".

Eigenvectors and eigenvalues depend on the concepts of vectors and linear transformations. In most cases a vector can be taken to be a list of real numbers. (For example, a vector can be the three coordinates of a point in three-dimensional space, relative to some Cartesian coordinate system. It helps to think of that point as the tip of an arrow whose tail is at the origin of the coordinate system.) A linear transformation of such vectors can then be defined by a square matrix with one row and one column for each element of the vector. If we think of a vector x as a matrix with n rows and one column, then the linear operator defined by an n×n matrix A maps x to the matrix product y = Ax; that is, for each index i, y_i = Σ_j a_ij x_j.

Two vectors x and y are said to be parallel if one is a multiple of the other, that is, if every element of one is the corresponding element of the other times the same scaling factor.
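The componentwise definition y_i = Σ_j a_ij x_j, and the resulting eigenvector condition Av = λv, can be sketched in a few lines of code. The matrix and vector here are illustrative choices, not entries from the question's table:

```python
# Componentwise matrix-vector product: y_i = sum_j a_ij * x_j.
# The diagonal matrix and basis vector below are illustrative examples.

def matvec(A, x):
    n = len(A)
    return [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(n)]

A = [[2.0, 0.0, 0.0],
     [0.0, 3.0, 0.0],
     [0.0, 0.0, 5.0]]
x = [0.0, 1.0, 0.0]   # second standard basis vector

y = matvec(A, x)
print(y)  # [0.0, 3.0, 0.0], i.e. 3*x: x is an eigenvector of this
          # diagonal A with eigenvalue 3
```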
For example, the vector x with elements 1, 3, 4 and the vector y with elements −20, −60, −80 are parallel, because each element of y is −20 times the corresponding element of x. If x and y are arrows in three-dimensional space, the condition "x is parallel to y" means that their arrows lie on the same straight line and may differ only in length and sense along that line.

In general, multiplication of a non-zero vector x by a square matrix A yields a vector y that is not parallel to x. When the two vectors are parallel, that is, when Ax = λx for some real number λ, we say that x is an eigenvector of A, and the scale factor λ is the corresponding eigenvalue. In particular, multiplication by a 3×3 matrix A may change both the direction and the magnitude of an arrow in three-dimensional space. However, if the arrow x is an eigenvector of A with eigenvalue λ, the operation may only change its length and either keep its direction or flip it (make the arrow point in the exact opposite direction). Specifically, the length of the arrow will increase if |λ| > 1, remain the same if |λ| = 1, and decrease if |λ| < 1. Moreover, the direction will be exactly the same if λ > 0, and flipped if λ < 0. If λ = 0, the arrow shrinks to the zero vector.

Examples

[Figure and worked example omitted: the original showed a transformation matrix together with a vector it maps to a multiple of itself (an eigenvector, with eigenvalue 1) and a vector whose image is not a multiple of itself (not an eigenvector). Points on a line through the origin parallel to an eigenvector remain on that line after the transformation, while the vectors shown in red are not eigenvectors and have their direction altered.]

The identity matrix I (whose general element I_ij is 1 if i = j, and 0 otherwise) maps every vector to itself. Therefore every vector is an eigenvector of I, with eigenvalue 1.
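The parallelism test can be sketched directly, using the text's own vectors x = (1, 3, 4) and y = (−20, −60, −80):

```python
# Two vectors are parallel iff one is a scalar multiple of the other.

def parallel(x, y, tol=1e-12):
    """Return True if y == c*x for some scalar c (x non-zero)."""
    # pick the candidate scale factor from the first non-zero element of x
    for xi, yi in zip(x, y):
        if abs(xi) > tol:
            c = yi / xi
            break
    else:
        return False  # x is the zero vector
    # every element of y must be c times the corresponding element of x
    return all(abs(yi - c * xi) <= tol * max(1.0, abs(yi))
               for xi, yi in zip(x, y))

x = [1.0, 3.0, 4.0]
y = [-20.0, -60.0, -80.0]
print(parallel(x, y))          # True: each element of y is -20 times x's
print(parallel(x, [1, 2, 3]))  # False: no single scale factor works
```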
General definition

The concept of eigenvectors and eigenvalues extends naturally to abstract linear operators on abstract vector spaces. Let V be any vector space over some scalar field K, and let T be a linear transformation mapping V into V. We say that a non-zero vector x of V is an eigenvector of T if (and only if) there is a scalar λ in K such that T(x) = λx. This equation is called the eigenvalue equation for T. Here T(x) means the result of applying the operator T to the vector x, while λx means the product of the scalar λ with x. (Some authors allow x to be zero in the definition of eigenvector; with that choice, the zero vector would be an eigenvector of every linear operator, and any scalar would be an eigenvalue for it.)

Eigenspace and spectrum

If x is an eigenvector of T, then any scalar multiple ax of x with non-zero a is also an eigenvector with the same eigenvalue, since T(ax) = aT(x) = a(λx) = λ(ax). Moreover, if x and y are eigenvectors with the same eigenvalue λ, then x + y is also an eigenvector with eigenvalue λ. Therefore the set of all eigenvectors with the same eigenvalue λ, together with the zero vector, is a linear subspace of V, called the eigenspace of λ. If that subspace has dimension 1, it is sometimes called an eigenline.

The list of eigenvalues of T is sometimes called the spectrum of T. The order of this list is arbitrary, but the number of times an eigenvalue λ appears is important: it is the dimension of the corresponding eigenspace, that is, the maximum number of linearly independent eigenvectors with eigenvalue λ.

Eigenvalues and eigenvectors of matrices

Characteristic polynomial

The eigenvalues of a matrix A are precisely the solutions λ of the equation det(A − λI) = 0, where det denotes the determinant of the matrix A − λI and I is the n×n identity matrix.
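The closure argument above, T(ax) = λ(ax) and T(x + y) = λ(x + y), can be checked numerically. The matrix below is an illustrative example with a repeated eigenvalue (so that it has two independent eigenvectors sharing λ = 2), not a matrix from the question:

```python
# Numerical check that eigenvectors sharing an eigenvalue form a
# subspace: scalar multiples and sums are still eigenvectors.
# The diagonal matrix here is an illustrative example whose eigenvalue
# 2 is repeated, giving two independent eigenvectors with the same lam.

def matvec(A, v):
    return [sum(a * vj for a, vj in zip(row, v)) for row in A]

def scale(c, v):
    return [c * vi for vi in v]

def add(u, v):
    return [ui + vi for ui, vi in zip(u, v)]

A = [[2.0, 0.0, 0.0],
     [0.0, 2.0, 0.0],
     [0.0, 0.0, 5.0]]
lam = 2.0
x = [1.0, 0.0, 0.0]   # eigenvector with eigenvalue 2
y = [0.0, 1.0, 0.0]   # another eigenvector with the same eigenvalue

for v in (scale(4.0, x), add(x, y)):
    assert matvec(A, v) == scale(lam, v)   # A v == lam v still holds
print("scalar multiples and sums remain eigenvectors for eigenvalue", lam)
```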
This equation is called the characteristic equation (or, less often, the secular equation) of A. For example, if A is a diagonal matrix with diagonal entries a_1,1, ..., a_n,n, then A − λI is also diagonal and the characteristic equation reads (a_1,1 − λ)(a_2,2 − λ)···(a_n,n − λ) = 0. The solutions of this equation are the eigenvalues λ_i = a_i,i (i = 1, ..., n).

Proving this relation between eigenvalues and solutions of the characteristic equation requires some linear algebra, specifically the notion of linearly independent vectors. Briefly, the eigenvalue equation for a matrix A can be expressed as Ax = λx, which can be rearranged to (A − λI)x = 0. If the inverse (A − λI)⁻¹ exists, both sides can be left-multiplied by it to obtain x = 0. Therefore, if λ is such that A − λI is invertible, λ cannot be an eigenvalue. The converse holds too: if A − λI is not invertible, λ is an eigenvalue. A criterion from linear algebra states that a matrix (here A − λI) is non-invertible if and only if its determinant is zero, which leads to the characteristic equation.

The left-hand side of this equation can be seen (using Leibniz's rule for the determinant) to be a polynomial function of λ whose coefficients depend on the entries of A. This polynomial is called the characteristic polynomial. Its degree is n; that is, the highest power of λ occurring in it is λⁿ. At least for small matrices, the solutions of the characteristic equation, and hence the eigenvalues of A, can be found directly. The characteristic polynomial is also important for theoretical purposes, such as the Cayley–Hamilton theorem, and it shows that any n×n matrix has at most n eigenvalues. However, the characteristic equation need not have n distinct solutions; there may be strictly fewer than n distinct eigenvalues. This happens, for example, for a matrix describing a shear mapping.
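For a 3×3 matrix the characteristic polynomial can be written out explicitly as det(A − λI) = −λ³ + tr(A)λ² − Mλ + det(A), where M is the sum of the principal 2×2 minors. A sketch of this, using an illustrative diagonal matrix (not the question's) so that the eigenvalues should come out as exactly the diagonal entries:

```python
# Characteristic polynomial of a 3x3 matrix, written out from the
# determinant: det(A - lam*I) = -lam^3 + tr(A)*lam^2 - M*lam + det(A),
# where M is the sum of the principal 2x2 minors.
# The diagonal matrix below is an illustrative example: each diagonal
# entry should be a root, as stated in the text above.

A = [[2.0, 0.0, 0.0],
     [0.0, 3.0, 0.0],
     [0.0, 0.0, 5.0]]

def det3(m):
    """Determinant of a 3x3 matrix by cofactor expansion."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

tr = A[0][0] + A[1][1] + A[2][2]                  # trace of A
M = (A[1][1] * A[2][2] - A[1][2] * A[2][1]        # sum of principal
     + A[0][0] * A[2][2] - A[0][2] * A[2][0]      # 2x2 minors
     + A[0][0] * A[1][1] - A[0][1] * A[1][0])

def p(lam):
    """Characteristic polynomial p(lam) = det(A - lam*I)."""
    return -lam**3 + tr * lam**2 - M * lam + det3(A)

# Each diagonal entry of this diagonal A is a root of p:
for lam in (2.0, 3.0, 5.0):
    assert p(lam) == 0.0
print("char poly: -l^3 +", tr, "l^2 -", M, "l +", det3(A))
```

Since the polynomial has degree 3, this also illustrates why a 3×3 matrix has at most 3 eigenvalues.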
Hire Me For All Your Tutoring Needs
Integrity-first tutoring: clear explanations, guidance, and feedback.
Drop an Email at
drjack9650@gmail.com
Chat Now And Get Quote