
Properties of matrix multiplication - class-XII

Description: properties of matrix multiplication
Number of Questions: 72
Tags: maths matrices and determinants algebra matrices business maths matrix

If $A=\begin{bmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{bmatrix}$ then $A^n=\begin{bmatrix} 3^{n-1} & 3^{n-1} & 3^{n-1} \\ 3^{n-1} & 3^{n-1} & 3^{n-1} \\ 3^{n-1} & 3^{n-1} & 3^{n-1} \end{bmatrix}$, $n \in N$

  1. True

  2. False


Correct Option: A
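A quick way to see this: every row of $A$ sums to $3$, so multiplying by $A$ triples each entry.

$A^{2}=3A$, and if $A^{k}=3^{k-1}A$ then $A^{k+1}=A^{k}A=3^{k-1}A^{2}=3^{k}A$; hence, by induction, $A^{n}=3^{n-1}A$ for every $n\in N$, which is exactly the stated matrix.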

If $A = \begin{bmatrix}1\\ 2\\ 3 \end{bmatrix}$, then $AA'=$

  1. $40$

  2. $\begin{bmatrix} 1\\ 4\\ 3 \end{bmatrix}$

  3. $\begin{bmatrix} 1 & 2 & 3\\ 2 & 4 & 6\\ 3 & 6 & 9\end{bmatrix}$

  4. None of these


Correct Option: C
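A brief check: with $A$ a $3\times 1$ column and $A'$ its $1\times 3$ transpose, $AA'$ is the $3\times 3$ outer product: $AA'=\begin{bmatrix}1\\ 2\\ 3\end{bmatrix}\begin{bmatrix}1 & 2 & 3\end{bmatrix}=\begin{bmatrix}1 & 2 & 3\\ 2 & 4 & 6\\ 3 & 6 & 9\end{bmatrix}$, which is option C.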

If for a matrix $A$, $A^3=I$, then $A^{-1}=$

  1. $A^2$

  2. $A^3$

  3. $A$

  4. none of these


Correct Option: A
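A one-line justification: since $A\cdot A^{2}=A^{3}=I$, the matrix $A^{2}$ satisfies the defining property of the inverse, so $A^{-1}=A^{2}$.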

Let $A$ be a square matrix such that $A^2 = A$ and $|A| \neq 0$, then choose the correct option.

(A' represents transpose of matrix A)

  1. $A = A'$

  2. $A = -A'$

  3. $A' =-I$

  4. $A = -I$


Correct Option: A
Explanation:

$A$ is a square matrix such that $A^2 = A$ and $|A| \neq 0$
$\Rightarrow A^{-1}A^2=A^{-1}A=I$
$\Rightarrow A=I$
$\therefore A=A'=I$
Hence, option A.

For two matrices $A$ and $B$, if $AB=0$, then

  1. $A=0$ and $B=0$

  2. $A=0$ or $B=0$

  3. it is not necessary that $A=0$ or $B=0$

  4. all above are false


Correct Option: C
Explanation:

If $AB=0$, it is not necessary that $A=0$ or $B=0$.


Take, for example, $A=\begin{bmatrix}0&0\\ 0&1 \end{bmatrix}, B=\begin{bmatrix}0&1\\ 0&0 \end{bmatrix}$

Clearly $AB=\begin{bmatrix}0&0\\ 0&0 \end{bmatrix}$, but $A\neq 0$ and $B\neq 0$.

For any non-singular matrix A, $ \displaystyle A^{-1} $ =

  1. $|A|adj A$

  2. $\dfrac{1}{|A| adj A}$

  3. $\dfrac{adj A}{|A|}$

  4. None of the above


Correct Option: C
Explanation:

A singular matrix is a square matrix whose determinant is equal to zero.

So, a non-singular matrix $A$ is a matrix whose determinant is non-zero.
$\implies$ the inverse of $A$, i.e. $A^{-1}$, exists.
$\therefore A^{-1}=\dfrac{adj A}{|A|}$ .... $[|A|\neq 0]$

If $A=\begin{bmatrix} \cos { \alpha  }  & -\sin { \alpha  }  \\ \sin { \alpha  }  & \cos { \alpha  }  \end{bmatrix}$, $B=\begin{bmatrix} \cos { 2\beta  }  & \sin { 2\beta  }  \\ \sin { 2\beta  }  & -\cos { 2\beta  }  \end{bmatrix}$, where $0 < \beta < { \pi  }/{ 2 }$, then prove that $BAB=$ ${ A }^{ -1 }$.

  1. True

  2. False


Correct Option: A
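A sketch of the verification, using only the matrices given above: $B$ is a reflection, so $B^{2}=I$, and conjugating the rotation $A$ by $B$ reverses its angle. Indeed, $BA=\begin{bmatrix} \cos(2\beta-\alpha) & \sin(2\beta-\alpha) \\ \sin(2\beta-\alpha) & -\cos(2\beta-\alpha) \end{bmatrix}=A^{-1}B$, and therefore $BAB=A^{-1}B^{2}=A^{-1}$.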

Let $A$ be a $3\times 2$ matrix with real entries. Let $H = A(A^{T}A)^{-1}A^{T}$ where $A^{T}$ is the transpose of $A$ and let $I$ be the identity matrix of order $3\times 3$. Then

  1. $H^{2} = I$

  2. $H^{2} = -I$

  3. $H^{2} = H$

  4. $H^{2} = -H$


Correct Option: C
Explanation:

$H^{ 2 }=A{ \left( { A }^{ T }A \right)  }^{ -1 }{ A }^{ T }\cdot A{ \left( { A }^{ T }A \right)  }^{ -1 }{ A }^{ T }=A{ \left( { A }^{ T }A \right)  }^{ -1 }\left[ \left( { A }^{ T }A \right) { \left( { A }^{ T }A \right)  }^{ -1 } \right] { A }^{ T }=A{ \left( { A }^{ T }A \right)  }^{ -1 }{ A }^{ T }=H$

Note that $A$ is $3\times 2$ and hence not invertible, so $(A^{T}A)^{-1}$ cannot be split as $A^{-1}(A^{T})^{-1}$; the direct computation above shows $H^{2}=H$.

If $A^3 = 0$ then $I + A + A^2$ is equal to

  1. I + A

  2. $(I + A)^{-1}$

  3. I - A

  4. $(I - A)^{-1}$


Correct Option: D
Explanation:
Since $A^3 = 0$, every eigenvalue of $A$ is $0$, so every eigenvalue of $I-A$ is $1$ and

$(I-A)$ is invertible. We have

$(I - A) (I + A + A^2) = I - A^3 = I - 0 = I$

$(I - A) (I + A + A^2)  = I$

$\therefore I + A + A^2 = (I - A)^{-1}$

Find the number of all possible ordered pairs of $(n\times n)$ matrices $A$ and $B$ for which $AB-BA=I$.

  1. Infinite

  2. $n^2$

  3. $n!$

  4. Zero


Correct Option: D
Explanation:
$AB-BA=I$

Let $A$ and $B$ be $n\times n$ matrices and take the trace of both sides: trace $(AB-BA)=$ trace $(I)$

trace $(AB)-$ trace $(BA)=n$

But trace $(AB)=$ trace $(BA)$, so the left-hand side is $0$, giving $n=0$, which is impossible.

So there are zero possible ordered pairs of $(n\times n)$ matrices.

If $\omega$ is the complex cube root of unity, then the inverse of $\begin{bmatrix} \omega  & 0 & 0 \\ 0 & { \omega  }^{ 2 } & 0 \\ 0 & 0 & { \omega  }^{ 3 } \end{bmatrix}$ is

  1. $\begin{bmatrix} -\omega & 0 & 0 \\ 0 & { \omega }& 0 \\ 0 & 0 & { \omega }^{ 2 } \end{bmatrix}$

  2. $\begin{bmatrix} \omega^{2} & 0 & 0 \\ 0 & { \omega }& 0 \\ 0 & 0 & 1 \end{bmatrix}$

  3. $\begin{bmatrix} \omega^{3} & 0 & 0 \\ 0 & { \omega } & 0 \\ 0 & 0 & 1 \end{bmatrix}$

  4. $\begin{bmatrix} \omega & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & { \omega }^{ 2 } \end{bmatrix}$


Correct Option: B
Explanation:
$M=\begin{bmatrix} \omega & 0 & 0 \\ 0 & { \omega }^{ 2 } & 0 \\ 0 & 0 & { \omega }^{ 3 } \end{bmatrix}$
$\det(M)=\omega\cdot\omega^{2}\cdot\omega^{3}=\omega^{6}=1\quad(\because \omega^{3}=1)$

$adj\,M = \begin{bmatrix} { \omega }^{ 5 } & 0 & 0 \\ 0 & { \omega }^{ 4 } & 0 \\ 0 & 0 & { \omega }^{ 3 } \end{bmatrix}$

${ M }^{ -1 }=\dfrac { adj\,M }{ \det(M) } = \begin{bmatrix} { \omega }^{ 2 } & 0 & 0 \\ 0 & { \omega } & 0 \\ 0 & 0 & 1 \end{bmatrix}\quad(\because \omega^{3}=1)$

Option $B$ is correct

The inverse of the matrix $\begin{bmatrix}1 & 0 & 1\\ 0 & 2 & 3\\ 1 & 2& 1\end{bmatrix}$ is

  1. $\dfrac {-1}{6} \begin{bmatrix}-4 & 2 & -2\\ 3 & 0 & -3\\ -2 & -2& 2\end{bmatrix}$

  2. $\dfrac {1}{6} \begin{bmatrix}-4 & 2 & -2\\ 3 & 0 & -3\\ -2 & -2& 2\end{bmatrix}$

  3. $\begin{bmatrix}-2 & 1 & -1\\ 1 & 0 & -1\\ -2 & -2& 2\end{bmatrix}$

  4. $\begin{bmatrix}2 & -1 & 1\\ -1 & 0 & 1\\ 2 & 2& -2\end{bmatrix}$


Correct Option: A
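A brief worked check: $|A|=1(2-6)-0+1(0-2)=-6$ and $adj\, A=\begin{bmatrix}-4 & 2 & -2\\ 3 & 0 & -3\\ -2 & -2 & 2\end{bmatrix}$, so $A^{-1}=\dfrac{adj\, A}{|A|}=\dfrac{-1}{6}\begin{bmatrix}-4 & 2 & -2\\ 3 & 0 & -3\\ -2 & -2 & 2\end{bmatrix}$, which is option A.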

If $A=\begin{bmatrix} i & 0 \\ 0 & -1 \end{bmatrix}$, then check whether ${{\text{A}}^2} =  - {\text{I}}$, $({{\text{i}}^2} =  - 1)$

  1. True

  2. False


Correct Option: B
Explanation:

Given $A=\begin{bmatrix} i & 0  \\ 0 & -1 \end{bmatrix}$ where $i^{2}=-1$


Taking LHS

$\Rightarrow A^{2}=A.A$

$=\begin{bmatrix} i & 0 \\ 0 & -1 \end{bmatrix}\begin{bmatrix} i & 0 \\ 0 & -1 \end{bmatrix}$ (using matrix multiplication)

$\Rightarrow \begin{bmatrix} { i }^{ 2 }+0 & 0+0 \\ 0+0 & 0+1 \end{bmatrix}$

$\Rightarrow \begin{bmatrix} -1 & 0 \\ 0 & 1 \end{bmatrix}$ (which is not equal to $-I$)

as $-I=\begin{bmatrix} -1 & 0 \\ 0 & -1 \end{bmatrix}$

$\therefore A^{2}=-I$ is not a valid relation.

If $A = \begin{bmatrix}\cos \theta & \sin \theta \\ - \sin \theta & \cos \theta \end{bmatrix}$ where $\theta  = \dfrac{2\pi }{19}$, then ${A^{2017}} = $

  1. $A$

  2. ${A^3}$

  3. ${A^5}$

  4. $I$


Correct Option: B
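A short justification: as proved by induction in a later question, $A^{n}=\begin{bmatrix}\cos n\theta & \sin n\theta \\ -\sin n\theta & \cos n\theta \end{bmatrix}$. With $\theta=\dfrac{2\pi}{19}$, $19\theta=2\pi$ gives $A^{19}=I$; and since $2017=19\times 106+3$, $A^{2017}=(A^{19})^{106}A^{3}=A^{3}$.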

If A and B are matrices of the same order, then $\displaystyle \left ( A+B \right )^{2}= A^{2}+2AB+B^{2}$ is possible, iff

  1. AB= I

  2. BA= I

  3. AB= BA

  4. none of these


Correct Option: C
Explanation:

$\displaystyle \left ( A+B \right )^{2}=(A+B)(A+B)$
$= A^{2}+AB+BA+B^{2}$
If $AB=BA$ then 
$\left ( A+B \right )^{2}=A^{2}+2AB+B^{2}$


If $A$ and $B$ are any two matrices, then  

  1. $AB=BA$

  2. $AB=I$

  3. $AB=0$

  4. $AB$ may or may not be defined


Correct Option: D
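A brief justification: the product $AB$ of two arbitrary matrices is defined only when the number of columns of $A$ equals the number of rows of $B$, so $AB$ may or may not be defined; and even when both $AB$ and $BA$ exist, none of $AB=BA$, $AB=I$, $AB=0$ is forced.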

If $A^{2}-A+I=0$, then inverse of $A$ is

  1. $A^{-2}$

  2. $A+I$

  3. $I-A$

  4. $A-I$


Correct Option: C
Explanation:
$A^2-A+I=0$

Multiplying both sides by $A^{-1}$:

$A^{-1}(A^2-A+I)=A^{-1}(0)$

$A-I+A^{-1}=0$

$A^{-1}=-(A-I)$

$A^{-1}=I-A$.

The matrices $\begin{bmatrix} \cos { \theta  }  & -\sin { \theta  }  \\ \sin { \theta  }  & \cos { \theta  }  \end{bmatrix}$ and $\begin{bmatrix} a & 0 \\ 0 & b \end{bmatrix}$ commute under multiplication

  1. if $a=b$ or $\theta=n\pi,$ where $n$ is an integer

  2. always

  3. never

  4. if $a\cos { \theta  } \neq b\sin { \theta  } $


Correct Option: A
Explanation:

$\begin{bmatrix} \cos { \theta  }  & -\sin { \theta  }  \\ \sin { \theta  }  & \cos { \theta  }  \end{bmatrix}\begin{bmatrix} a & 0 \\ 0 & b \end{bmatrix}=\begin{bmatrix} a\cos { \theta  }  & -b\sin { \theta  }  \\ a\sin { \theta  }  & b\cos { \theta  }  \end{bmatrix}$   ...(1)

And $\begin{bmatrix} a & 0 \\ 0 & b \end{bmatrix}\begin{bmatrix} \cos { \theta  }  & -\sin { \theta  }  \\ \sin { \theta  }  & \cos { \theta  }  \end{bmatrix}=\begin{bmatrix} a\cos { \theta  }  & -a\sin { \theta  }  \\ b\sin { \theta  }  & b\cos { \theta  }  \end{bmatrix}$   ...(2)
From (1) and (2), we get
$a\sin { \theta  } =b\sin { \theta  } \Rightarrow \left( a-b \right) \sin { \theta  } =0$
either $a=b$ or $\sin { \theta  } =0$
$\Rightarrow\theta=n\pi;n\in Z$

If $A$ and $B$ are two square matrices of order $3 \times  3$ which satisfy $AB = A$ and $BA = B$, then Which of the following is true?

  1. If matrix $A$ is singular, then matrix $B$ is non singular.

  2. If matrix $A$ is nonsingular, then matrix $B$ is singular.

  3. If matrix $A$ is singular, then matrix $B$ is also singular.

  4. Cannot say anything.


Correct Option: C
Explanation:
$A$ and $B$ are two square matrices of order $3\times3$ which satisfy $AB=A$ and $BA=B$.
A singular matrix is one whose determinant is zero, i.e. a matrix that is not invertible.
Suppose $B$ were non-singular. Then $BA=B$ gives $B^{-1}(BA)=B^{-1}B$, i.e. $A=I$, which is non-singular. So a singular $A$ forces $B$ to be singular as well.
Hence, if matrix $A$ is singular, then matrix $B$ is also singular.

The multiplication of matrices is distributive with respect to the matrix addition.

State true or false.

  1. True

  2. False


Correct Option: A
Explanation:

It is a well-known fact that multiplication of matrices is distributive with respect to matrix addition, i.e. $A(B+C)=AB+AC$ and $(A+B)C=AC+BC$.

The inverse of the matrix $\begin{bmatrix}3 & 5 & 7 \\ 2 & -3 & 1 \\ 1 & 1 & 2\end{bmatrix}$ is $\begin{bmatrix}7 & -3 & 26 \\ 3 & 1 & 11 \\ -5 & -2 & 0\end{bmatrix}$.
State true or false.

  1. True

  2. False


Correct Option: A

For matrices, $AB = O$ does not necessarily mean that

  1. $A=0$

  2. $B=0$

  3. Both $ A = 0$ and $B=0$

  4. all of the above


Correct Option: D
Explanation:

Let $\quad A = \begin{bmatrix}1 & -1 & 1 \\ -3 & 2 & -1 \\ -2 & 1 & 0\end{bmatrix}$ and $B = \begin{bmatrix}1&2& 3\\ 2&4&6 \\ 1&2 &3\end{bmatrix}$

$\therefore\quad AB = \begin{bmatrix}1 & -1 & 1 \\ -3 & 2 & -1 \\ -2 & 1 & 0\end{bmatrix}\times\begin{bmatrix}1&2& 3\\ 2&4&6 \\ 1&2 &3\end{bmatrix}$

$\quad = \begin{bmatrix}0&0&0 \\ 0&0&0 \\ 0&0&0\end{bmatrix} = O$

$\therefore \quad AB = O$
But neither $A = O$ nor $B = O$.

If inverse of $A=\begin{bmatrix} 1 & 1 & 1 \\ 2 & -1 & -1 \\ 1 & -1 & 1 \end{bmatrix} $ is $\cfrac { -1 }{ 6 } \begin{bmatrix} -2 & -2 & 0 \\ -3 & 0 & \alpha  \\ -1 & 2 & -3 \end{bmatrix} $ then $\alpha=$

  1. $0$

  2. $-3$

  3. $3$

  4. $2$


Correct Option: C
Explanation:
Given,

$A=\begin{bmatrix}1&1&1\\ 2&-1&-1\\ 1&-1&1\end{bmatrix}$

$A^{-1}=\begin{bmatrix}1&1&1\\ 2&-1&-1\\ 1&-1&1\end{bmatrix}^{-1}$

$=\begin{bmatrix}1&1&1&\mid \:&1&0&0\\ 2&-1&-1&\mid \:&0&1&0\\ 1&-1&1&\mid \:&0&0&1\end{bmatrix}$

$\:R _1\:\leftrightarrow \:R _2$

$=\begin{bmatrix}2&-1&-1&\mid \:&0&1&0\\ 1&1&1&\mid \:&1&0&0\\ 1&-1&1&\mid \:&0&0&1\end{bmatrix}$

$R _2\:\leftarrow \:R _2-\frac{1}{2}\cdot \:R _1$

$R _3\:\leftarrow \:R _3-\frac{1}{2}\cdot \:R _1$

$=\begin{bmatrix}2&-1&-1&\mid \:&0&1&0\\ 0&\frac{3}{2}&\frac{3}{2}&\mid \:&1&-\frac{1}{2}&0\\ 0&-\frac{1}{2}&\frac{3}{2}&\mid \:&0&-\frac{1}{2}&1\end{bmatrix}$

$R _3\:\leftarrow \:R _3+\frac{1}{3}\cdot \:R _2$

$=\begin{bmatrix}2&-1&-1&\mid \:&0&1&0\\ 0&\frac{3}{2}&\frac{3}{2}&\mid \:&1&-\frac{1}{2}&0\\ 0&0&2&\mid \:&\frac{1}{3}&-\frac{2}{3}&1\end{bmatrix}$

$R _3\:\leftarrow \frac{1}{2}\cdot \:R _3$

$R _2\:\leftarrow \:R _2-\frac{3}{2}\cdot \:R _3$

$=\begin{bmatrix}2&-1&-1&\mid \:&0&1&0\\ 0&\frac{3}{2}&0&\mid \:&\frac{3}{4}&0&-\frac{3}{4}\\ 0&0&1&\mid \:&\frac{1}{6}&-\frac{1}{3}&\frac{1}{2}\end{bmatrix}$

$R _1\:\leftarrow \:R _1+1\cdot \:R _3$

$R _2\:\leftarrow \frac{2}{3}\cdot \:R _2$

$=\begin{bmatrix}2&-1&0&\mid \:&\frac{1}{6}&\frac{2}{3}&\frac{1}{2}\\ 0&1&0&\mid \:&\frac{1}{2}&0&-\frac{1}{2}\\ 0&0&1&\mid \:&\frac{1}{6}&-\frac{1}{3}&\frac{1}{2}\end{bmatrix}$

$R _1\:\leftarrow \:R _1+1\cdot \:R _2$

$=\begin{bmatrix}2&0&0&\mid \:&\frac{2}{3}&\frac{2}{3}&0\\ 0&1&0&\mid \:&\frac{1}{2}&0&-\frac{1}{2}\\ 0&0&1&\mid \:&\frac{1}{6}&-\frac{1}{3}&\frac{1}{2}\end{bmatrix}$

$R _1\:\leftarrow \frac{1}{2}\cdot \:R _1$

$=\begin{bmatrix}1&0&0&\mid \:&\frac{1}{3}&\frac{1}{3}&0\\ 0&1&0&\mid \:&\frac{1}{2}&0&-\frac{1}{2}\\ 0&0&1&\mid \:&\frac{1}{6}&-\frac{1}{3}&\frac{1}{2}\end{bmatrix}$

$=\begin{bmatrix}\frac{1}{3}&\tfrac{1}{3}&0\\ \tfrac{1}{2}&0&-\tfrac{1}{2}\\ \tfrac{1}{6}&-\tfrac{1}{3}&\tfrac{1}{2}\end{bmatrix}$

$=-\dfrac{1}{6}\begin{bmatrix}-2 &-2  &0 \\  -3& 0 &3 \\  -1& 2 &-3 \end{bmatrix}$

$\therefore \alpha =3$

Let $\displaystyle A=\begin{pmatrix}1 &2 \\ 3  &4 \end{pmatrix}$ and $\displaystyle B=\begin{pmatrix}a &0 \\ 0  &b \end{pmatrix}$, $a,b \in N$. Then

  1. there cannot exist any B such that $\displaystyle AB = BA $

  2. there exist more than one but finite number of B's such that $\displaystyle AB = BA$

  3. there exists exactly One B such that $\displaystyle AB = BA$

  4. there exist infinitely many B's such that $\displaystyle AB = BA.$


Correct Option: D
Explanation:

$A=\begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}$ and $B=\begin{bmatrix} a & 0 \\ 0 & b \end{bmatrix}$

$AB = \begin{bmatrix} a & 2b \\ 3a & 4b \end{bmatrix}$

$BA = \begin{bmatrix} a & 2a \\ 3b & 4b \end{bmatrix}$

$AB = BA \Rightarrow 2b=2a$ and $3a=3b$, i.e. $a=b$

$\therefore$ there exist infinitely many $B$'s (one for every choice $B=aI$, $a\in N$) such that $AB=BA$.

If $A$ is an invertible square matrix then $|A^{-1}| = ?$

  1. $|A|$

  2. $\dfrac {1}{|A|}$

  3. $1$

  4. $0$


Correct Option: B
Explanation:

$AA^{-1} = I\Rightarrow |AA^{-1}| = |I| = 1$
$\Rightarrow |A|\cdot |A^{-1}| = 1\Rightarrow |A^{-1}| = \dfrac {1}{|A|}$.

If $A = \begin{bmatrix} -2& 3\\ 1 & 1\end{bmatrix}$ then $|A^{-1}| = ?$

  1. $-5$

  2. $\dfrac {-1}{5}$

  3. $\dfrac {1}{25}$

  4. $25$


Correct Option: B
Explanation:

$AA^{-1} = I\Rightarrow |AA^{-1}| = |I|\Rightarrow |A|\cdot |A^{-1}| = 1\Rightarrow |A^{-1}| = \dfrac {1}{|A|}$.
$|A| = \begin{vmatrix}-2 & 3\\ 1 & 1\end{vmatrix} = (-2 - 3) = -5 \Rightarrow |A^{-1}| = \dfrac {-1}{5}$.

If matrices $A$ and $B$ anticommute then

  1. $AB = BA$

  2. $AB = -BA$

  3. $(AB) = (BA)^{-1}$

  4. None of these


Correct Option: B
Explanation:

If two Square matrices $A$ and $B$ follow property of anti-commutation, then 

 $\Leftrightarrow AB = -BA$.

Let $A$ and $B$ be two $2 \times 2$ matrices. Consider the statements
          $(i)$ $AB =0 \Rightarrow A = 0$ or $B = 0$
         $ (ii)$ $AB =I \Rightarrow A =B^{-1}$
          $(iii)$ $(A + B)^2 = A^2 + 2AB + B^2$

  1. $(i)$ is false, $(ii)$ and $(iii)$ are true

  2. $(i)$ and $(iii)$ are false, $(ii)$ is true

  3. $(i)$ and $(ii)$ are false, $(iii)$ is true

  4. $(ii)$ and $(iii)$ are false, $(i)$ is true


Correct Option: B
Explanation:

(i) is false.


If $A=\begin{bmatrix}0& 1\\ 0 & -1\end{bmatrix}$ and $B =\begin{bmatrix}1& 1\\ 0 & 0\end{bmatrix}$, then

 $AB=\begin{bmatrix}0& 0\\ 0 & 0\end{bmatrix}=0$

Thus,  $AB = 0$ but neither $A = 0$ nor $B = 0$

(iii) is false since matrix multiplication is not commutative.

$(A + B)^2 = A^2 + AB + BA + B^2$

(ii) is true as the product $AB$ is an identity matrix, if and only $B$ is inverse of the matrix $A$.

Hence, option B.

If $A = \begin{bmatrix} 2& -1\\ 1 & 3\end{bmatrix}$, then $A^{-1} = ?$

  1. $\begin{bmatrix}\dfrac {3}{7} &\dfrac {-1}{7} \\ \dfrac {1}{7} & \dfrac {2}{7} \end{bmatrix}$

  2. $\begin{bmatrix}\dfrac {3}{7} &\dfrac {1}{7} \\ \dfrac {-1}{7} & \dfrac {2}{7} \end{bmatrix}$

  3. $\begin{bmatrix}\dfrac {3}{7} &\dfrac {1}{7} \\ \dfrac {1}{7} & \dfrac {2}{7} \end{bmatrix}$

  4. None of these


Correct Option: B
Explanation:

$|A| = \begin{vmatrix}2 & -1\\ 1 & 3\end{vmatrix} = (6 + 1) = 7\neq 0$
$M _{11} = 3, M _{12} = 1, M _{21} = -1$ and $M _{22} = 2$


$\therefore C _{11} = 3, C _{12} = -1, C _{21} = 1$ and $C _{22} = 2$

$\Rightarrow Adj\ A = \begin{bmatrix} 3& -1\\ 1 & 2\end{bmatrix}^T = \begin{bmatrix} 3& 1\\ -1 & 2\end{bmatrix}$

$\Rightarrow A^{-1} = \dfrac {1}{|A|} (adj\ A) = \dfrac {1}{7} \begin{bmatrix}3 & 1\\ -1 & 2\end{bmatrix} = \begin{bmatrix}\dfrac {3}{7} & \dfrac {1}{7}\\ -\dfrac {1}{7} & \dfrac {2}{7}\end{bmatrix}$.

If $A$ and $B$ are invertible square matrices of the same order then $(AB)^{-1} = ?$

  1. $AB^{-1}$

  2. $A^{-1}B$

  3. $A^{-1}B^{-1}$

  4. $B^{-1}A^{-1}$


Correct Option: D
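A one-line check: $(AB)(B^{-1}A^{-1})=A(BB^{-1})A^{-1}=AA^{-1}=I$ and likewise $(B^{-1}A^{-1})(AB)=I$, so $(AB)^{-1}=B^{-1}A^{-1}$.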

If $A$ and $B$ are two square matrices of the same order and $m$ is a positive integer, then
$(A + B)^m =$ $^mC _0A^m +$ $^mC _1 A^{m -1} B + ^mC _2A^{m-2} B^2 + ... +$ $^mC _{m- 1} AB^{m-1}+$ $^mC _m B^m$ if

  1. $AB =BA$

  2. $AB + BA =0$

  3. $A^m = 0,\ B^m = 0$

  4. none of these.


Correct Option: A
Explanation:

Binomial theorem is applicable if and only if $AB=BA$.
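To see why commuting is needed, already for $m=2$: $(A+B)^{2}=A^{2}+AB+BA+B^{2}$, and the middle terms collapse to $2AB={}^{2}C _1 AB$ only when $AB=BA$.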

Let $A, B$ and $C$ be $2\times 2$ matrices with entries from the set of real numbers. Define $\ast $ as follows: $\displaystyle A\ast B=\frac{1}{2}(AB + BA)$, then

  1. $A\ast B=B\ast A$

  2. $A\ast A=A^2$

  3. $A\ast (B+C)=A\ast B+A\ast C$

  4. $A\ast I=A$


Correct Option: A,B,C,D
Explanation:

$\displaystyle A*I=\frac { 1 }{ 2 } \left( AI+IA \right) =\frac { 1 }{ 2 } \left( A+A \right) =A$

$\therefore A*I=A$, so option (D) holds.
$\displaystyle A*A=\frac { 1 }{ 2 } \left( AA+AA \right) =\frac { 1 }{ 2 } \left( { A }^{ 2 }+{ A }^{ 2 } \right) ={ A }^{ 2 }$
$\therefore$ option (B) holds.
$\displaystyle A*B=\frac { 1 }{ 2 } \left( AB+BA \right)$ and $B*A=\frac { 1 }{ 2 } \left( BA+AB \right) =\frac { 1 }{ 2 } \left( AB+BA \right) $
$[\because $ addition is commutative$]$
$\therefore A*B=B*A$, so option (A) holds.
$\displaystyle A*\left( B+C \right) =\frac { 1 }{ 2 } \left( A\left( B+C \right) +\left( B+C \right) A \right) =\frac { 1 }{ 2 } \left( AB+AC+BA+CA \right)$
$\displaystyle  =\frac { 1 }{ 2 } \left( AB+BA \right) +\frac { 1 }{ 2 } \left( AC+CA \right) =A*B+A*C$
$\therefore$ option (C) holds.

If $A$ and $B$ are square matrices of the same order such that $A^2=A,\ B^2=B,\ AB = BA = 0$, then

  1. $AB^2=0$

  2. $(A + B)^2 = A + B$

  3. $(A - B)^2 = A - B$

  4. none of these.


Correct Option: A,B
Explanation:

$AB^2 =(AB)B = 0B =0$  

$(A + B)^2 =A^2 +AB + BA + B^2$

$=A+0+0+B =A+B$

Also, $(A - B)^2 = A^2 - AB - BA + B^2 = A + B\neq A-B$ in general, so option (C) fails.

If $A^k=0$ for some value of $k$ and $B=I+A+A^2+...+A^{k-1},$ then $B^{-1}$ equals

  1. $I-A$

  2. $I+A$

  3. $I-A^{k-1}$

  4. None of these


Correct Option: A
Explanation:

Let $\displaystyle B=I+A+{ A }^{ 2 }+{ A }^{ 3 }+...+{ A }^{ k-1 }$

$\displaystyle \therefore B\left( I-A \right) =\left( I+A+{ A }^{ 2 }+...+{ A }^{ k-1 } \right) \left( I-A \right) $
$\displaystyle =I-A+A-{ A }^{ 2 }+{ A }^{ 2 }-...+{ A }^{ k-1 }-{ A }^{ k }=I-{ A }^{ k }=I\quad \quad \quad \left( \because { A }^{ k }=O \right) $
Hence $\displaystyle B\left( I-A \right)=I$, and similarly $\displaystyle \left( I-A \right)B=I$
$\displaystyle \Rightarrow { B }^{ -1 }=I-A$

Let $A, B$ and $C$ be $2\times 2$ matrices with entries from the set of real numbers. Define $\ast $ as follows:
  $\displaystyle A \ast B=\frac{1}{2}(AB\,'+A'B)$. Which of the given is true?

  1. $A\ast B= B \ast A$

  2. $A\ast A=A^2$

  3. $A\ast (B+C)=A\ast B+A \ast C$

  4. $A\ast I =A+A'$


Correct Option: A,C
Explanation:

$\displaystyle A \ast B=\frac{1}{2}(AB'+A'B)$

1) $\displaystyle B \ast A=\frac{1}{2}(BA'+B'A)=\frac{1}{2}(AB'+A'B)=A \ast B$

2)$\displaystyle A \ast A=\frac{1}{2}(AA'+A'A)$

3)$\displaystyle A \ast (B+C)=\frac{1}{2}(A(B+C)'+A'(B+C))$

                             $=\displaystyle\frac{1}{2}(AB'+A'B)+\frac{1}{2}(AC'+A'C)$

                             $=A\ast B+A \ast C$

4)$\displaystyle A \ast I=\frac{1}{2}(AI'+A'I)=\frac{1}{2}(A+A')$

Hence, options A and C.

Say true or false:

Let A, B be two matrices such that they commute, then $(AB)^n = A^nB^n$.

  1. True

  2. False


Correct Option: A
Explanation:

$A$ and $B$ commute with each other, so $AB=BA$
${ \left( AB \right)  }^{ n }=(AB)(AB)\cdots(AB)$; since $AB=BA$, each $B$ can be moved to the right past the remaining $A$'s, giving ${ \left( AB \right)  }^{ n }={ A }^{ n }{ B }^{ n }$

If $A$ is a non-singular matrix, then 

  1. ${ A }^{ -1 }$ is symmetric if $A$ is symmetric

  2. ${ A }^{ -1 }$ is skew-symmetric if $A$ is symmetric

  3. $\left| { A }^{ -1 } \right| =\left| A \right| $

  4. $\left| { A }^{ -1 } \right| ={ \left| A \right|  }^{ -1 }$


Correct Option: A,D
Explanation:

Since $\left| A \right| \neq 0$, therefore ${ A }^{ -1 }$ exists.

Now, $A{ A }^{ -1 }=I={ A }^{ -1 }A$
$\Rightarrow \left( A{ A }^{ -1 } \right) '=I'=\left( { A }^{ -1 }A \right) '\Rightarrow \left( { A }^{ -1 } \right) 'A'=I=A'\left( { A }^{ -1 } \right) '\quad \quad \quad \left( \because A'=A \right) $
$\Rightarrow \left( { A }^{ -1 } \right) 'A=I=A\left( { A }^{ -1 } \right) '\Rightarrow { A }^{ -1 }=\left( { A }^{ -1 } \right) '\Rightarrow { A }^{ -1 }$ is symmetric
Also, since $\left| A \right| \neq 0,\therefore { A }^{ -1 }$ exists such that
$A{ A }^{ -1 }=I={ A }^{ -1 }A\Rightarrow \left| A{ A }^{ -1 } \right| =\left| I \right| $
$\Rightarrow \left| A \right| \left| { A }^{ -1 } \right| =1\quad \quad \left( \because \left| AB \right| =\left| A \right| \left| B \right|  \right) $
$\displaystyle \Rightarrow \left| { A }^{ -1 } \right| =\frac { 1 }{ \left| A \right|  } $

The inverse of a skew-symmetric matrix of an odd order is

  1. a symmetric matrix

  2. a skew-symmetric matrix

  3. diagonal matrix

  4. does not exists


Correct Option: D
Explanation:

Let $A$ be a skew-symmetric matrix of order $n.$

By definition $\displaystyle { A }^{ T }=-A$ 
$\displaystyle\Rightarrow \left| { A }^{ T } \right| =\left| -A \right| \Rightarrow \left| A \right| ={ \left( -1 \right)  }^{ n }\left| A \right| $
$\displaystyle \Rightarrow \left| A \right| =-\left| A \right|\quad\quad[\because n$ is odd $]$
$\displaystyle \Rightarrow 2\left| A \right| =0\Rightarrow\left| A \right| =0$
$\therefore{ A }^{ -1 }$ does not exist. 

If $AB=A$ and $BA=B$, where $A$ and $B$ are square matrices, then 

  1. ${ B }^{ 2 }=B$ and ${ A }^{ 2 }=A$

  2. ${ B }^{ 2 }=A$ and ${ A }^{ 2 }=B$

  3. $AB=BA$

  4. none of these


Correct Option: A
Explanation:

We have 

$AB=A$. Substituting $B=BA$ gives $A\left( BA \right) =A$
$\Rightarrow \left( AB \right) A=A\quad$ [by associativity]
Substituting $AB=A$,
$\Rightarrow AA=A \Rightarrow { A }^{ 2 }=A$
Again, $BA=B$. Substituting $A=AB$ gives

$\Rightarrow B\left( AB \right) =B$
$\Rightarrow \left( BA \right) B=B$
$\Rightarrow BB=B$
$\Rightarrow { B }^{ 2 }=B$

If $A=\begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}$, $B=\begin{bmatrix} 0 & -i \\ i & 0 \end{bmatrix}$ then ${(A+B)}^{2}$ equals

  1. ${A}^{2}+{B}^{2}$

  2. ${A}^{2}+{B}^{2}+2AB$

  3. ${A}^{2}+{B}^{2}+AB-BA$

  4. none of these


Correct Option: A
Explanation:

Given, $A=\begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix},B=\begin{bmatrix} 0 & -i \\ i & 0 \end{bmatrix}$


$ A+B=\begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}+\begin{bmatrix} 0 & -i \\ i & 0 \end{bmatrix}=\begin{bmatrix} 0 & 1-i \\ i+1 & 0 \end{bmatrix}$

$ { \left( A+B \right)  }^{ 2 }=\begin{bmatrix} 0 & 1-i \\ i+1 & 0 \end{bmatrix}\begin{bmatrix} 0 & 1-i \\ i+1 & 0 \end{bmatrix}=\begin{bmatrix} 2 & 0 \\ 0 & 2 \end{bmatrix}$

$ { A }^{ 2 }=\begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}\begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}=\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$

$ { B }^{ 2 }=\begin{bmatrix} 0 & -i \\ i & 0 \end{bmatrix}\begin{bmatrix} 0 & -i \\ i & 0 \end{bmatrix}=\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$

$ { A }^{ 2 }+{ B }^{ 2 }=\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}+\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}=\begin{bmatrix} 2 & 0 \\ 0 & 2 \end{bmatrix}$

If $D=diag({d} _{1}, {d} _{2}, {d} _{3},........,{d} _{n})$, where ${d} _{i}\ne 0$ for all $i=1, 2,.....,n$, then ${D}^{-1}$ is equal to

  1. $D$

  2. ${I} _{n}$

  3. diag $({d} _{1}^{-1}, {d} _{2}^{-1}, ........{d} _{n}^{-1})$

  4. None of these


Correct Option: C
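A brief justification: multiplying two diagonal matrices multiplies corresponding diagonal entries, so $diag({d} _{1},\ldots,{d} _{n})\cdot diag({d} _{1}^{-1},\ldots,{d} _{n}^{-1})=diag(1,\ldots,1)={I} _{n}$; hence $D^{-1}=diag({d} _{1}^{-1}, {d} _{2}^{-1},\ldots,{d} _{n}^{-1})$, which is option C.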

If for suitable matrices $A, B$; $AB=A$ and $BA=B$; then ${A}^{2}$ equals-

  1. $I$

  2. $A$

  3. $B$

  4. $0$


Correct Option: B
Explanation:

Taking $AB=A$


Multiplying by a matrix $A$

$ABA=A^2$

$\Rightarrow A(BA)=A^2$      (Associative property)

$AB=A^2$      ($\because BA=B$)

$\Rightarrow A=A^2$    ($\because AB=A$)

If $\mathrm{A}=\begin{bmatrix} 8 & -6 & 2\\ -6 & 7 & -4\\ 2 & -4 & \lambda \end{bmatrix}$ is a singular matrix, then  $\lambda =$ 

  1. 3

  2. 4

  3. 2

  4. 5


Correct Option: A
Explanation:

Given, $A=\begin{pmatrix} 8 & -6 & 2\\ -6 & 7 & -4\\ 2 & -4 & \lambda \end{pmatrix}$ is a singular matrix
So, $\det A=0$
$\therefore $ Expanding the determinant along the first row,
$\det A=8(7 \lambda-16)+6[-6 \lambda + 8]+2[24-14]$
$=56 \lambda - 128 -36 \lambda +48 +20$
$=20 \lambda - 60$
So, $\det A = 0 \Rightarrow 20 \lambda -60=0$
$\lambda =3$

If $\begin{bmatrix} \mathrm{x} & \mathrm{y}^{3}\\ 2 & 0 \end{bmatrix}=\begin{bmatrix} 1 & 8\\ 2 & 0 \end{bmatrix}$, then  $\begin{bmatrix} \mathrm{x} & \mathrm{y}\\ 2 & 0 \end{bmatrix}^{-1}$ is equal to

  1. $-\dfrac{1}{4}\begin{bmatrix} 0 &-2\\ -2 & 1 \end{bmatrix}$

  2. $\dfrac{2}{4}\begin{bmatrix} 1 & 0\\ 0 & 1 \end{bmatrix}$

  3. $\dfrac{1}{4}\begin{bmatrix} 0 & -8\\ -2 & 1 \end{bmatrix}$

  4. $\dfrac{1}{4}\begin{bmatrix} 1&4 \\ 7 &2 \end{bmatrix}$


Correct Option: A
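A brief worked solution: equating corresponding entries gives $x=1$ and $y^{3}=8$, so $y=2$. Writing $M=\begin{bmatrix}1 & 2\\ 2 & 0\end{bmatrix}$ (a name used here only for this computation), $|M|=1\cdot 0-2\cdot 2=-4$ and $adj\, M=\begin{bmatrix}0 & -2\\ -2 & 1\end{bmatrix}$, so $M^{-1}=\dfrac{adj\, M}{|M|}=-\dfrac{1}{4}\begin{bmatrix}0 & -2\\ -2 & 1\end{bmatrix}$, i.e. option A.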

$p=$ $\begin{bmatrix} 0 & x &0 \\  0& 0 & 1 \end{bmatrix}$, then $p^{-1}$=


  1. Not possible to get an inverse

  2. $\begin{bmatrix} x & -a &-bx \\ 0&1 &0 \\ 0&0 &x \end{bmatrix}$

  3. $\mathrm{x}$ $\begin{bmatrix} x & -a &-bx \\ 0&1 &0 \\ 0&0 &x \end{bmatrix}$

  4. $x^{2} \begin{bmatrix} x & -a &-bx \\ 0&1 &0 \\ 0&0 &x \end{bmatrix}$


Correct Option: A
Explanation:
Requirements to have an Inverse

1. The matrix must be square (same number of rows and columns).
2. The determinant of the matrix must not be zero.
The given matrix $\left[ \begin{matrix} 0 & x & 0 \\ 0 & 0 & 1 \end{matrix} \right] $ is $2\times 3$, so it does not satisfy the first condition; hence it is not a square matrix and its inverse does not exist.

option A is correct.

A= $\begin{bmatrix} \cos\alpha  & -\sin\alpha \\ \sin\alpha  & \cos\alpha \end{bmatrix}$, then find which of the following are correct 
I) A is singular matrix
II) $A^{-1}$=$A^{T}$
III) A is symmetric matrix
IV) $A^{-1}= -A$

  1. only I and II

  2. only II and III

  3. only II

  4. only IV


Correct Option: C
Explanation:

$\left | A \right |= \cos ^{2}\alpha + \sin ^{2}\alpha = 1$, so $A$ is not singular
$A^{T}= \begin{bmatrix}\cos\alpha    & \sin \alpha \\ -\sin \alpha  & \cos\alpha \end{bmatrix}\neq A$, so $A$ is not symmetric
$A^{-1}= \dfrac{1}{1}\begin{bmatrix}\cos\alpha    & \sin \alpha \\ -\sin \alpha  & \cos\alpha \end{bmatrix}\neq -A$
$\therefore A^{T}= A^{-1}$, so only II is correct

If AB=KI where $\displaystyle K\in R$ then $\displaystyle A^{-1}$= _____

  1. B

  2. KB

  3. $\displaystyle \frac{1}{K}B$

  4. $\displaystyle \frac{1}{K^{2}}B$


Correct Option: C
Explanation:
Given $AB=KI$, where $K\in R$, $K\neq 0$, is a constant and $I$ is the identity matrix
Multiplying both sides on the left by ${ A }^{ -1 }$:
${ A }^{ -1 }(AB)=K{ A }^{ -1 }I$
$\Rightarrow B=K{ A }^{ -1 }\Rightarrow { A }^{ -1 }=\cfrac { 1 }{ K } B$
OPTION C

If A=$\displaystyle \begin{bmatrix} 5 & -3   \\ 4 & 2   \end{bmatrix}$ then find $\displaystyle AA^{-1}$

  1. $\displaystyle \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}$

  2. $\displaystyle \begin{bmatrix} -1 & 0 \\ 0 & -1 \end{bmatrix}$

  3. $\displaystyle \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$

  4. Does not exist


Correct Option: C
Explanation:

For any invertible square matrix $A$, $ A{A}^{-1} $ is always equal to the identity matrix $ I $; here $|A| = 5\times 2-(-3)\times 4 = 22 \neq 0$, so ${A}^{-1}$ exists.

So, $ A{A}^{-1} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} $

If $\displaystyle A=\left[ \begin{matrix} \cos { \theta  }  & \sin { \theta  }  \\ -\sin { \theta  }  & \cos { \theta  }  \end{matrix} \right] $, then $\displaystyle \underset { n\rightarrow \infty  }{ \lim } \frac { 1 }{ n } { A }^{ n }$ is?

  1. A null matrix

  2. An identity matrix

  3. $\displaystyle \left[ \begin{matrix} 0 & 1 \\ -1 & 0 \end{matrix} \right] $

  4. None of these


Correct Option: A
Explanation:
$A=\begin{bmatrix} \cos { \theta  }  & \sin { \theta  }  \\ -\sin { \theta  }  & \cos { \theta  }  \end{bmatrix}\lim _{ n\rightarrow \infty  }{ \cfrac { 1 }{ n } { A }^{ n } } $
${ A }^{ n }={ \begin{bmatrix} \cos { \theta  }  & \sin { \theta  }  \\ -\sin { \theta  }  & \cos { \theta  }  \end{bmatrix} }^{ n }$
${ A }^{ n }={ \begin{bmatrix} \cos { n\theta  }  & \sin { n\theta  }  \\ -\sin { n\theta  }  & \cos { n\theta  }  \end{bmatrix} }$
Now,
$\lim _{ n\rightarrow \infty  }{ \cfrac { 1 }{ n } { A }^{ n } } =\lim _{ n\rightarrow \infty  }{ \cfrac { 1 }{ n }  } { \begin{bmatrix} \cos { n\theta  }  & \sin { n\theta  }  \\ -\sin { n\theta  }  & \cos { n\theta  }  \end{bmatrix} }$
$=\begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}=$ Null matrix, since every entry of ${ A }^{ n }$ has absolute value at most $1$
Proof for ${ A }^{ n }={ \begin{bmatrix} \cos { n\theta  }  & \sin { n\theta  }  \\ -\sin { n\theta  }  & \cos { n\theta  }  \end{bmatrix} }=P\left( n \right) $
$P\left( n \right) $is true for $n=1$
For $n=k,k\ge 1$
${ A }^{ k }={ \begin{bmatrix} \cos { k\theta  }  & \sin { k\theta  }  \\ -\sin { k\theta  }  & \cos { k\theta  }  \end{bmatrix} }$
${ A }^{ k+1 }={ A }^{ k }A$
$={ \begin{bmatrix} \cos { k\theta  }  & \sin { k\theta  }  \\ -\sin { k\theta  }  & \cos { k\theta  }  \end{bmatrix} }{ \begin{bmatrix} \cos { \theta  }  & \sin { \theta  }  \\ -\sin { \theta  }  & \cos { \theta  }  \end{bmatrix} }$
$\Rightarrow { \begin{bmatrix} \cos { (k+1)\theta  }  & \sin { (k+1)\theta  }  \\ -\sin { (k+1)\theta  }  & \cos { (k+1)\theta  }  \end{bmatrix} }$ (by the angle-addition formulas)
$\therefore P\left( n \right) $ is true for $n=k+1\left( k\ge 1 \right) $

If A is invertible, then which of the following is not true?

  1. $\displaystyle { A }^{ -1 }={ \left| A \right| }^{ -1 }$

  2. $\displaystyle { \left( { A }^{ 2 } \right) }^{ -1 }={ \left( { A }^{ -1 } \right) }^{ 2 }$

  3. $\displaystyle { \left( { A }^{ ' } \right) }^{ -1 }={ \left( { A }^{ -1 } \right) }^{ ' }$

  4. None of these


Correct Option: A
Explanation:
$A$ is invertible
$\Rightarrow { A }^{ -1 }$ exists
Option A: ${ A }^{ -1 }={ \left| A \right|  }^{ -1 }$
A matrix cannot be equated to a scalar (the reciprocal of its determinant), so this relation is not true in general
$\therefore $ option A is not true
Option B: ${ \left( { A }^{ 2 } \right)  }^{ -1 }={ \left( { A }^{ -1 } \right)  }^{ 2 }$
This is true, by the property ${ \left( { A }^{ n } \right)  }^{ -1 }={ \left( { A }^{ -1 } \right)  }^{ n }$
Option C: ${ \left( { A }' \right)  }^{ -1 }={ \left( { A }^{ -1 } \right)  }'$
Consider ${ A }'{ \left( { A }^{ -1 } \right)  }'={ \left( { A }^{ -1 }A \right)  }'={ I }'=I\quad ...(1)$
Similarly
${ \left( { A }^{ -1 } \right)  }'{ A }'={ \left( A{ A }^{ -1 } \right)  }'={ I }'=I\quad ...(2)$
From $(1)$ and $(2)$,
${ \left( { A }^{ -1 } \right)  }'$ is the multiplicative inverse of ${ A }'$
$\Rightarrow { \left( { A }' \right)  }^{ -1 }={ \left( { A }^{ -1 } \right)  }'$, so option C is true
Hence only option A is not true

Which of the following matrices is not invertible?

  1. $\displaystyle \left[ \begin{matrix} 1 & 1 \\ 0 & 1 \end{matrix} \right] $

  2. $\displaystyle \left[ \begin{matrix} -1 & -1 \\ -1 & 2 \end{matrix} \right] $

  3. $\displaystyle \left[ \begin{matrix} 2 & 3 \\ 4 & 6 \end{matrix} \right] $

  4. $\displaystyle \left[ \begin{matrix} 2 & -2 \\ 1 & 1 \end{matrix} \right] $


Correct Option: C
Explanation:

A square matrix that is not invertible is called a singular matrix; its determinant is $0$.
$\displaystyle \left[\begin{matrix} 2 & 3 \\ 4 & 6 \end{matrix}  \right]$ is a non-invertible matrix,
$\because$ its determinant is    $12-12 =0$.
Option C is correct.

If the matrix $\displaystyle \begin{bmatrix} a & b \\ c & d \end{bmatrix} $ is commutative with the matrix $\displaystyle \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix} $, then

  1. $a=0, b=c$

  2. $b=0, c=d$

  3. $c=0, d=a$

  4. $d=0, a=b$


Correct Option: C
Explanation:

Since matrix $\begin{bmatrix} a & b \\ c & d \end{bmatrix}$ is commutative with the matrix $\begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}$


So,

$\begin{bmatrix} a & b \\ c & d \end{bmatrix}\begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}=\begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}\begin{bmatrix} a & b \\ c & d \end{bmatrix}$

$\begin{bmatrix} a & a+b \\ c & c+d \end{bmatrix}=\begin{bmatrix} a+c & b+d \\ c & d \end{bmatrix}$

$a=a+c,\quad a+b=b+d$
$c=0,\quad a=d$

Consider two matrices $A = \begin{bmatrix} 1 & 2\\ 2 & 1\\ 1 & 1 \end{bmatrix}$ and $ B = \begin{bmatrix} 1 & 2 & -4\\  2 & 1 & -4 \end{bmatrix}$. Which one of the following is correct?

  1. B is the right inverse of A

  2. B is the left inverse of A

  3. B is the both sided inverse of A

  4. None of the above


Correct Option: B
Explanation:

For an $m \times n$ matrix $A$ and an $n \times m$ matrix $B$:

If $AB =I$, then $B$ is a right inverse of $A$
If $BA=I$, then $B$ is a left inverse of $A$
If $AB=BA=I$, then $B$ is a both-sided inverse of $A$; this requires $m=n$, which isn't true here
$AB=\begin{bmatrix} 1 & 2 \\ 2 & 1 \\ 1 & 1 \end{bmatrix}\begin{bmatrix} 1 & 2 & -4 \\ 2 & 1 & -4 \end{bmatrix}=\begin{bmatrix} 5 & 4 & -12 \\ 4 & 5 & -12 \\ 3 & 3 & -8 \end{bmatrix}\neq I$
$BA=\begin{bmatrix} 1 & 2 & -4 \\ 2 & 1 & -4 \end{bmatrix}\begin{bmatrix} 1 & 2 \\ 2 & 1 \\ 1 & 1 \end{bmatrix}=\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}=I$
$\therefore$ B is the left inverse of A

If $A$ is a square matrix of order $3$ and det $A = 5$, then what is det $[(2A)^{-1}]$ equal to?

  1. $\dfrac{1}{10}$

  2. $\dfrac{2}{5}$

  3. $\dfrac{8}{5}$

  4. $\dfrac{1}{40}$


Correct Option: D
Explanation:

If $A$ is of order $3$, then $2A$ is also of order $3$.

Now, $\text{det} (cA)=c^n(\text{det}\, A)$ where $n$ is the order of the matrix, so $\text{det}(2A)=2^3\times 5=40$.
And $\text{det } (M^{-1})=\dfrac{1}{\text{det}\, M}$ for any invertible matrix $M$.
Thus $\text{det} [(2A)^{-1}]=\dfrac{1}{\text{det}(2A)}=\dfrac{1}{8\times 5}$
$=\dfrac{1}{40}$

If A is a square matrix such that $A^2 = I $ where I is the identity matrix, then what is $A^{-1}$ equal to ?

  1. A + 1

  2. Null matrix

  3. A

  4. Transpose of A


Correct Option: C
Explanation:
$A\rightarrow $ square matrix with ${ A }^{ 2 }=I$
To find ${ A }^{ -1 }$
Given ${ A }^{ 2 }=I$
$\Rightarrow A\times A=I$
So $A$ itself satisfies the defining property of an inverse of $A$
$\therefore { A }^{ -1 }=A$
Option C is correct

If A is an orthogonal matrix of order 3 and $B=\begin{bmatrix}1&2&3\\-3&0&2\\2&5&0\end{bmatrix}$, then which of the following is/are correct?
1. $|AB|= \pm 47$
2. $AB=BA$
Select the correct answer using the code given below :

  1. 1 only

  2. 2 only

  3. Both 1 and 2

  4. Neither 1 nor 2


Correct Option: A
Explanation:

Solution:

$A$ is an orthogonal matrix, so $AA^T=I$
$\Rightarrow|A|^2=1$
$\Rightarrow|A|=\pm 1$
$|AB|=|A|\,|B|=\pm|B|$
Now, $|B|=\begin{vmatrix}1&2&3\\-3&0&2\\2&5&0\end{vmatrix}=1(0-10)-2(0-4)+3(-15-0)=-47$
$\therefore |AB|=\pm47$, so statement 1 is correct.
Since $A$ is not specified, $AB$ and $BA$ need not coincide; matrix multiplication is not commutative in general, even for orthogonal $A$. So statement 2 is not necessarily correct.
Hence, A is the correct option.

If A is a non singular matrix satisfying $A=AB-BA$, then which one of the following holds true

  1. $det. B=0$

  2. $B=0$

  3. $det. A=1$

  4. $det(B+I) =det(B-I)$


Correct Option: D
Explanation:

$A=AB-BA$
$A+BA=AB$
$(I+B)A=AB$
$\left| I+B \right| \left| A \right| =\left| A \right| \left| B \right| $
$\left| I+B \right| =\left| B \right| \longrightarrow (1)\quad(\because \left| A \right|\neq 0)$
$A=AB-BA$
$BA=AB-A$
$BA=A(B-I)$
$\left| B \right| \left| A \right| =\left| A \right| \left| B-I \right| $
$\left| B \right| =\left| B-I \right| \longrightarrow (2)$
From equations (1) and (2),
$\left| B-I \right| =\left| B+I \right| $

If $A$ is a square matrix of order $3$, then $|Adj\left( Adj{ A }^{ 2 } \right) |=$

  1. ${ |A| }^{ 2 }$

  2. ${ |A| }^{ 4 }$

  3. ${ |A| }^{ 8 }$

  4. ${ |A| }^{ 16 }$


Correct Option: C
Explanation:

$|adj\left( adj{ A }^{ 2 } \right) |={ \left| { A }^{ 2 } \right|  }^{ { \left( 3-1 \right)  }^{ 2 } }={ \left| { A }^{ 2 } \right|  }^{ 4 }={ |A| }^{ 8 }$
(using $|adj\, M|=|M|^{\,n-1}$ for an $n\times n$ matrix $M$, applied twice with $n=3$)


If $AB=0$ for the matrices
$A=\left[ \begin{matrix} \cos ^{ 2 }{ \theta  }  & \cos { \theta  } \sin { \theta  }  \\ \cos { \theta  } \sin { \theta  }  & \sin ^{ 2 }{ \theta  }  \end{matrix} \right] $ and $B=\left[ \begin{matrix} \cos ^{ 2 }{ \phi  }  & \cos { \phi  } \sin { \phi  }  \\ \cos { \phi  } \sin { \phi  }  & \sin ^{ 2 }{ \phi  }  \end{matrix} \right] $ then $\theta-\phi $ is

  1. an odd multiple of $\dfrac{\pi}{2}$

  2. an odd multiple of ${\pi}$

  3. an even multiple of $\dfrac{\pi}{2}$

  4. $0$


Correct Option: A
Explanation:
$A=\begin{bmatrix} \cos ^{ 2 }\theta  & \cos { \theta  } \sin { \theta  }  \\ \cos { \theta  } \sin { \theta  }  & \sin ^{ 2 }\theta  \end{bmatrix}$      $B=\begin{bmatrix} \cos ^{ 2 }\phi  & \cos { \phi  } \sin { \phi  }  \\ \cos { \phi  } \sin { \phi  }  & \sin ^{ 2 }\phi  \end{bmatrix}$

$AB = \begin{bmatrix} \cos ^{ 2 }\theta \cos^{2}\phi + \cos { \theta  } \sin { \theta  } \cos\phi \sin\phi & \cos^2\theta \cos\phi \sin\phi+\sin^2\phi \sin\theta \cos\theta \\ \cos { \theta  } \sin { \theta  } \cos^{2}\phi  + \sin ^{ 2 }\theta \cos\phi  \sin\phi  & \cos\theta \sin\theta \cos\phi \sin\phi  +\sin^{2}\theta\sin^{2}\phi   \end{bmatrix}$

Each entry contains the common factor $\cos\theta\cos\phi+\sin\theta\sin\phi=\cos(\theta-\phi)$; for instance the $(1,1)$ entry is $\cos\theta\cos\phi\,\cos(\theta-\phi)$. Hence

$AB=\cos (\theta - \phi )\begin{bmatrix} \cos\theta \cos \phi  & \cos { \theta  } \sin { \phi  }  \\ \sin {\theta  } \cos {\phi   }  & \sin\theta \sin \phi   \end{bmatrix} = \begin{bmatrix} 0  & 0  \\ 0  & 0  \end{bmatrix}$

The bracketed matrix is not the zero matrix (the sum of the squares of its entries is $1$), so

$\Rightarrow \cos (\theta - \phi) = 0$

$\theta - \phi = (2n+1)\dfrac{\pi}{2}$

i.e an odd multiple of $\dfrac{\pi}{2}$ 
Let $A$ be a matrix of order $2 \times 2$ such that $A^2 = 0$ then $A^2 - (a + d)A + (ad - bc) I$ is equal to
  1. $I$

  2. $0 _{2\times 2}$

  3. $-I$

  4. none of these


Correct Option: B
Explanation:
Given $A=\begin{pmatrix} a & b \\ c & d \end{pmatrix}$

$A^2=\begin{pmatrix} a & b \\ c & d \end{pmatrix}$$\begin{pmatrix} a & b \\ c & d \end{pmatrix}=\begin{pmatrix} a^2+bc & ab+bd \\ ca+cd & bc+d^2 \end{pmatrix}............................(1)$

$-(a+d)A=\begin{pmatrix} -a^2-ad & -ab-bd \\ -ac-cd & -ad-d^2 \end{pmatrix}..............(2)$

$(ad-bc)I=\begin{pmatrix} ad-bc & 0 \\ 0 & ad-bc \end{pmatrix}.............(3)$
Adding 1,2,3 we get,
$A^2-(a+d)A+(ad-bc)I=\begin{pmatrix} a^2+bc-a^2-ad+ad-bc & ab+bd-ab-bd \\ ac+cd-ac-cd & bc+d^2-ad-d^2+ad-bc \end{pmatrix}=\begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}$

Hence, the value of $A^2-(a+d)A+(ad-bc)I=0 _{2\times 2}$

Let $A$ and $B$ are two matrices such that $AB =BA$, then for every $n\in N$,

  1. $A^nB=BA^n$

  2. $(AB)^n = A^nB^n$

  3. $(A+B)^n=$ $^nC _0A^n+$ $^nC _1A^{n-1}B^1+$ $^nC _2A^{n-2}B^2+ ... +$ $^nC _n B^n$.

  4. $A^{2n}-B^{2n}=(A^n-B^n)(A^n+B^n)$


Correct Option: A,B,C,D
Explanation:

$A^2B =A(AB) =A(BA) =(AB)A$

$= (BA)A =BA^2$

Similarly, $A^3B=BA^3$

In general $A^nB=BA^n\ \forall\ n\geq 1$

(b) and (c) hold as $AB =BA$.

Also, $(A^n -B^n) (A^n + B^n)$

$=A^nA^n-B^nA^n+A^nB^n-B^nB^n$

$=A^{2n}-B^{2n}$

Hence, options A,B,C and D.

If $D _1$ and $D _2$ are two $3\times 3$ diagonal matrices, then

  1. $D _1D _2$ is a diagonal matrix

  2. $D _1D _2=D _2D _1$

  3. $D _1^2+D _2^2$ is a diagonal matrix

  4. none of these


Correct Option: A,B,C
Explanation:

Let ${ D } _{ 1 }=\begin{bmatrix} a & 0 & 0 \\ 0 & b & 0 \\ 0 & 0 & c \end{bmatrix},{ D } _{ 2 }=\begin{bmatrix} x & 0 & 0 \\ 0 & y & 0 \\ 0 & 0 & z \end{bmatrix}$

Then

${ D } _{ 1 }{ D } _{ 2 }=\begin{bmatrix} a & 0 & 0 \\ 0 & b & 0 \\ 0 & 0 & c \end{bmatrix}\begin{bmatrix} x & 0 & 0 \\ 0 & y & 0 \\ 0 & 0 & z \end{bmatrix}=\begin{bmatrix} ax & 0 & 0 \\ 0 & by & 0 \\ 0 & 0 & cz \end{bmatrix}$


$ { D } _{ 2 }{ D } _{ 1 }=\begin{bmatrix} x & 0 & 0 \\ 0 & y & 0 \\ 0 & 0 & z \end{bmatrix}\begin{bmatrix} a & 0 & 0 \\ 0 & b & 0 \\ 0 & 0 & c \end{bmatrix}=\begin{bmatrix} xa & 0 & 0 \\ 0 & yb & 0 \\ 0 & 0 & zc \end{bmatrix}$

As $ax=xa,by=yb,cz=zc$

${ D } _{ 1 }{ D } _{ 2 }={ D } _{ 2 }{ D } _{ 1 }$

${ { D } _{ 1 } }^{ 2 }+{ { D } _{ 2 } }^{ 2 }=\begin{bmatrix} { a }^{ 2 } & 0 & 0 \\ 0 & { b }^{ 2 } & 0 \\ 0 & 0 & { c }^{ 2 } \end{bmatrix}+\begin{bmatrix} { x }^{ 2 } & 0 & 0 \\ 0 & { y }^{ 2 } & 0 \\ 0 & 0 & { z }^{ 2 } \end{bmatrix}=\begin{bmatrix} { a }^{ 2 }+{ x }^{ 2 } & 0 & 0 \\ 0 & { b }^{ 2 }+{ y }^{ 2 } & 0 \\ 0 & 0 & { c }^{ 2 }+{ z }^{ 2 } \end{bmatrix}$

If $\begin{bmatrix}2 &1 \\ 7 &4 \end{bmatrix}$A$\begin{bmatrix}-3 &2 \\ 5 &-3 \end{bmatrix}=\begin{bmatrix}1 &0 \\ 0&1 \end{bmatrix}$, then matrix A equals

  1. $\begin{bmatrix}7 &5 \\ -11 &-8 \end{bmatrix}$

  2. $\begin{bmatrix}2 & 1 \\ 5 & 3 \end{bmatrix}$

  3. $\begin{bmatrix}7 & 34 \\ 1 & 5 \end{bmatrix}$

  4. $\begin{bmatrix}5 & 13 \\ 3 & 8 \end{bmatrix}$


Correct Option: A
Explanation:

$\begin{bmatrix}2 &1 \\ 7 &4 \end{bmatrix}A\begin{bmatrix}-3 &2 \\ 5 &-3 \end{bmatrix}=\begin{bmatrix}1 &0 \\ 0&1 \end{bmatrix}$

Let $P=\begin{bmatrix}2 &1 \\ 7 &4 \end{bmatrix},Q=\begin{bmatrix}-3 &2 \\ 5 &-3 \end{bmatrix}, R=\begin{bmatrix}1 & 0 \\ 0 & 1 \end{bmatrix}$

$PAQ = R \Rightarrow  A = P^{-1}RQ^{-1}$

$\Rightarrow A=P^{-1}Q^{-1}=(QP)^{-1}\quad(\because R=I)$

$QP=\begin{bmatrix}8 &5 \\ -11 &-7 \end{bmatrix}$

$\therefore A=(QP)^{-1}=\begin{bmatrix}7 &5 \\ -11 &-8 \end{bmatrix}$ 

Hence, option A.

Let $A=\begin{bmatrix} 0&5 \\ -5 & 0\end{bmatrix}$ be a skew-symmetric matrix and $I + A$ be non-singular; then the matrix $B = (I - A)(I + A)^{-1}$ is

  1. an Orthogonal Matrix

  2. an Idempotent Matrix

  3. a Nilpotent Matrix

  4. Data Insufficient


Correct Option: A
Explanation:

$B=(I-A)(I+A)^{-1}$

$B^{T}=[(I+A)^{-1}]^{T}(I-A)^{T}$
$\Rightarrow B^{T}=[(I+A)^{T}]^{-1}(I-A)^{T}$
$\Rightarrow B^{T}=(I-A)^{-1}(I+A)\quad$ since $A^{T}=-A$
$B^{-1}=\left[(I-A)(I+A)^{-1}\right]^{-1}=(I+A)(I-A)^{-1}$
Since $(I+A)$ and $(I-A)$ commute, the two expressions above are equal, so
$B^{T}=B^{-1}\Rightarrow B\text{ is an Orthogonal Matrix}$

Consider the following statements :
$S _1$ : If $f(x)$ and $g(x)$ both are discontinuous function and $f(x) + g(x)$ is continuous, then $f(x) - g(x)$ is discontinuous.

$S _2$ : If a tangent to the standard ellipse $\displaystyle \frac{x^2}{a^2}+\displaystyle \frac{y^2}{b^2} = 1$ intersects the principal axis at A and  B, then least value of $AB$ is $(a+b)$.

$S _3$ : If $A$ and $B$ are two matrices such that $AB = O$, where $O$ is null matrix, then at least one of the matrices $A$ and $B$ must be a null matrix.

$S _4$ : If $a,b,c \in R$ and $D$ is a perfect square of a rational number, then both roots of the quadratic equation $a{ x }^{ 2 }+bx+c=0$ are rational.

State, in order, whether ${ S } _{ 1 },{ S } _{ 2 },{ S } _{ 3 }$ or $ { S } _{ 4 }$ are true or false.
  1. FFTT

  2. TTFF

  3. FFFT

  4. TTTF


Correct Option: B
Explanation:

S1 : If $f(x)$ and $g(x)$ both are discontinuous function and $f(x) + g(x)$ is continuous, then $f(x) - g(x)$ is discontinuous.
True
eg: $f(x) = [x]$ and $g(x) = [1-x]$ for non integral values of x, where [.] is a greatest integer function
$f(x)+g(x) = [x]+[1-x] = 0$ is continuous and
$f(x)-g(x)$ is discontinuous.
S2 : If a tangent to the standard ellipse $\displaystyle\frac { x^{ 2 } }{ a^{ 2 } } +\displaystyle\frac { y ^2}{b^2  } =1$  intersects the principal axis at $A$ and  $B$, then least value of $AB$ is $(a+b)$
True
Tangent at $\left( a\cos { \theta , } b\sin { \theta  }  \right)$ is $\displaystyle\frac { x\cos { \theta  }  }{ a } +\displaystyle\frac { y\sin { \theta  }  }{ b } =1$, which intersects the axes at $A=(\displaystyle\frac { a }{ \cos { \theta  }  } ,0)$ and $B=(0,\displaystyle\frac { b }{ \sin { \theta  }  })$
$AB^{2} = \displaystyle\frac { a^{ 2 } }{ \cos ^{ 2 }{ \theta  }  } +\displaystyle\frac { b^{ 2 } }{ \sin ^{ 2 }{ \theta  }  } $
AB is minimum at $\theta =\tan ^{ -1 }{ \left(\displaystyle \frac { \sqrt { b }  }{ \sqrt { a }  }  \right)  } $ and minimum value is $a+b$

S3 : If $A$ and $B$ are two matrices such that $AB=O$, where $O$ is null matrix, then at least one of the matrices $A$ and $B$ must be a null matrix.
False
product of two non zero  matrices can be a null matrix.
S4 : If $a,b,c \in R$ and $D$ is a perfect square of a rational number, then both roots of the quadratic equation $ax^{2}+bx+c=0$ are rational.
False
eg $\sqrt { 3 } x^{ 2 }+\sqrt { 28 } x+\sqrt { 3 } =0$
where $D$ is a perfect square and roots are not rational.

