In this video, we will learn about some special matrices and their properties: symmetric and skew-symmetric matrices. Symmetric matrices are very important for machine learning, and they are widely used in many other applications too. First of all, let us see what symmetric and skew-symmetric matrices are. A real square matrix A with entries a_ij is said to be symmetric if A = A^T. Take our first example here. What is the transpose? You interchange rows and columns: the first row 1, 1, -3 becomes the first column; the second row 1, 2, 2 becomes the second column; and the third row -3, 2, 1 becomes the third column. When you compare this transpose with the original matrix A, both are the same, so A is a symmetric matrix.

A skew-symmetric matrix is one where A = -A^T. Consider the second example, the matrix B. What is B^T? The first row of B, which is 0, -2, 1, becomes the first column; the second row 2, 0, 4 becomes the second column; and the third row -1, -4, 0 becomes the third column. If you carefully compare B^T with B, you will see that B^T is nothing but -B. So we can say that B is a skew-symmetric matrix.

Now let us try to understand some important properties of symmetric and skew-symmetric matrices. The first property concerns skew-symmetric matrices: all the diagonal entries of a skew-symmetric matrix are zero. Why are they zero? A skew-symmetric matrix satisfies A = -A^T, so each diagonal element satisfies a_ii = -a_ii. This implies 2a_ii = 0 for all i.
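As a quick aside, the two definitions can be checked numerically with the lecture's example matrices. A minimal sketch, assuming numpy is available:

```python
import numpy as np

# The two example matrices from the lecture.
A = np.array([[ 1, 1, -3],
              [ 1, 2,  2],
              [-3, 2,  1]])
B = np.array([[ 0, -2,  1],
              [ 2,  0,  4],
              [-1, -4,  0]])

# Symmetric: A equals its transpose.
print(np.array_equal(A, A.T))    # True -> A is symmetric

# Skew-symmetric: B equals minus its transpose.
print(np.array_equal(B, -B.T))   # True -> B is skew-symmetric
```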
This implies a_ii = 0 for all i. Here a_ii means the diagonal elements a_11, a_22, and so on. So the diagonal elements of a skew-symmetric matrix are all zero. This is the first property.

The second property is that all the eigenvalues of a symmetric matrix are real. In general, for an n x n matrix A, some of the eigenvalues may be complex; but for a symmetric matrix, all eigenvalues are real. How can we show it? Suppose we have the equation AX = lambda X, where X is not the zero vector. This means that lambda is an eigenvalue of A and X is the corresponding eigenvector. Now this A, which is given to us, is a real symmetric matrix. First of all, take the complex conjugate on both sides: A bar X bar = lambda bar X bar, where the bar means conjugating all the entries. Next, take the transpose on both sides: (A bar X bar)^T = (lambda bar X bar)^T. This gives X bar^T A bar^T = lambda bar X bar^T, because the transpose of the scalar lambda bar is lambda bar itself. Now A bar is A itself, because A is a real matrix, and A^T is A, because A is a symmetric matrix. So we can write X bar^T A = lambda bar X bar^T. Now multiply both sides on the right by the vector X: X bar^T A X = lambda bar X bar^T X. Because AX = lambda X, we can substitute on the left side: X bar^T (lambda X) = lambda bar X bar^T X. This implies lambda X bar^T X = lambda bar X bar^T X. Now, what is X bar^T X?
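Before answering that on paper, here is a small numerical aside showing that X bar^T X is a real, nonnegative number, the sum of |x_i|^2. A sketch assuming numpy; the sample vector is chosen arbitrarily for illustration:

```python
import numpy as np

# An arbitrary complex vector, just for illustration.
x = np.array([1 + 2j, 3 - 1j, -2j])

# X-bar-transpose times X: conjugate the entries, then take the dot product.
val = np.conj(x) @ x
print(val)                      # (19+0j): real and positive

# The same number, computed as the sum of the squared moduli |x_i|^2.
print(np.sum(np.abs(x) ** 2))   # 19.0
```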
You can easily see that if you take X as the vector (x_1, x_2, ..., x_n)^T, then X bar is (x_1 bar, x_2 bar, ..., x_n bar)^T. When you multiply these two, X bar^T X = |x_1|^2 + |x_2|^2 + ... + |x_n|^2, which is never zero. Why is it never zero? Because it would be zero only if all the x_i were zero; that would mean X = 0, and X is not zero because it is an eigenvector. Since X bar^T X is nonzero, we can cancel it from both sides. This implies lambda = lambda bar, and this implies lambda is real. In this way we can see that if we have a real symmetric matrix, then all its eigenvalues are always real.

The next property is that the eigenvalues of a skew-symmetric matrix are either purely imaginary or zero. This we can show along the same lines: in the proof above, simply replace A^T by -A. When you do that, the last equality becomes lambda = -lambda bar, and from this you can easily say that lambda is either zero or purely imaginary.

The next important property of symmetric matrices is that any two eigenvectors corresponding to two distinct eigenvalues of a symmetric matrix are orthogonal. What does this mean? Suppose you have two eigenvalues, lambda_1 and lambda_2. Both are real (we have just shown that all the eigenvalues of a symmetric matrix are real), and they are distinct, that is, lambda_1 is not equal to lambda_2. Suppose the eigenvector corresponding to lambda_1 is X_1, and the eigenvector corresponding to lambda_2 is X_2. That means AX_1 = lambda_1 X_1 and AX_2 = lambda_2 X_2, with X_1 not zero and X_2 not zero, because these are eigenvectors. We have to show that X_1 and X_2 are orthogonal.
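As an aside before we prove this, both eigenvalue properties can be checked numerically on the lecture's example matrices. A sketch, assuming numpy:

```python
import numpy as np

A = np.array([[ 1, 1, -3],
              [ 1, 2,  2],
              [-3, 2,  1]], dtype=float)   # symmetric
B = np.array([[ 0, -2,  1],
              [ 2,  0,  4],
              [-1, -4,  0]], dtype=float)  # skew-symmetric

eigs_A = np.linalg.eigvals(A)
eigs_B = np.linalg.eigvals(B)

# Symmetric matrix: imaginary parts of all eigenvalues are (numerically) zero.
print(np.allclose(eigs_A.imag, 0))  # True

# Skew-symmetric matrix: real parts are zero, i.e. purely imaginary or zero.
print(np.allclose(eigs_B.real, 0))  # True
```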
Orthogonal means X_1^T X_2 = 0; that is, the dot product of these two vectors is zero. This is also very easy to prove from the two conditions above. Take the transpose on both sides of the first equation: X_1^T A^T = lambda_1 X_1^T. Since A is a symmetric matrix, A^T is A, so X_1^T A = lambda_1 X_1^T. Now multiply both sides on the right by X_2: X_1^T A X_2 = lambda_1 X_1^T X_2. Since AX_2 = lambda_2 X_2, the left side becomes X_1^T (lambda_2 X_2). So lambda_2 X_1^T X_2 = lambda_1 X_1^T X_2, and this implies (lambda_2 - lambda_1) X_1^T X_2 = 0. Since lambda_2 and lambda_1 are distinct, lambda_2 - lambda_1 is never zero, and this implies X_1^T X_2 = 0. Hence we can say that the eigenvectors corresponding to distinct eigenvalues of a symmetric matrix are orthogonal.

In this video, we have seen two special matrices, symmetric and skew-symmetric, along with some of their important properties. Symmetric matrices hold some important properties: all the eigenvalues of a symmetric matrix are real, and the eigenvectors corresponding to distinct eigenvalues are orthogonal.
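The orthogonality property can also be verified numerically. A sketch assuming numpy, whose eigh routine is designed for symmetric matrices and returns the eigenvectors as orthonormal columns of a matrix V; when the eigenvalues are distinct, this orthogonality is exactly what the proof above forces:

```python
import numpy as np

# The symmetric example matrix from the lecture.
A = np.array([[ 1, 1, -3],
              [ 1, 2,  2],
              [-3, 2,  1]], dtype=float)

# w holds the (real) eigenvalues; column V[:, i] is the eigenvector for w[i].
w, V = np.linalg.eigh(A)

# Dot products between eigenvectors of distinct eigenvalues are (numerically) zero.
print(abs(V[:, 0] @ V[:, 1]) < 1e-10)   # True

# In fact V^T V is the identity: the eigenvectors form an orthonormal set.
print(np.allclose(V.T @ V, np.eye(3)))  # True
```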