Matrices: Definition, Types, Properties, Examples | Addition and Multiplication of Matrices

Matrices

1. Definition: A matrix is a rectangular array of mn numbers (arranged in m rows and n columns). Unlike a determinant, it has no numerical value.
\(A=\begin{bmatrix} a_{11} & a_{12} & \dots & a_{1n} \\ a_{21} & a_{22} & \dots & a_{2n} \\ \vdots & \vdots & & \vdots \\ a_{m1} & a_{m2} & \dots & a_{mn} \end{bmatrix}\) or \(A=\begin{pmatrix} a_{11} & a_{12} & \dots & a_{1n} \\ a_{21} & a_{22} & \dots & a_{2n} \\ \vdots & \vdots & & \vdots \\ a_{m1} & a_{m2} & \dots & a_{mn} \end{pmatrix}\)

Abbreviated as: A = [aij], 1 ≤ i ≤ m, 1 ≤ j ≤ n, where i denotes the row and j denotes the column; such an array is called a matrix of order m × n.

2. Special Types Of Matrices:

  • Row Matrix: A = [a11, a12, …… a1n], having one row; a (1 × n) matrix (also called a row vector).
  • Column Matrix:
    \(A=\begin{bmatrix} a_{11} \\ a_{21} \\ \vdots \\ a_{m1} \end{bmatrix}\)
    having one column; an (m × 1) matrix (also called a column vector)
  • Zero or Null Matrix: (A = Om×n) An m × n matrix all of whose entries are zero.
    \(A=\begin{bmatrix} 0 & 0 \\ 0 & 0 \\ 0 & 0 \end{bmatrix}\)
    is a 3 × 2 null matrix &
    \(B=\begin{bmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}\)
    is a 3 × 3 null matrix
  • Horizontal Matrix: A matrix of order m × n is a horizontal matrix if n > m.
    \(\begin{bmatrix} 1 & 2 & 3 & 4 \\ 2 & 5 & 1 & 1 \end{bmatrix}\)
  • Vertical Matrix: A matrix of order m × n is a vertical matrix if m > n.
    \(\begin{bmatrix} 2 & 5 \\ 1 & 1 \\ 3 & 6 \\ 2 & 4 \end{bmatrix}\)
  • Square Matrix: (Order n) If the number of rows = the number of columns, the matrix is a square matrix.
    Note:
    (i) In a square matrix the pair of elements aij & aji are called Conjugate Elements, e.g. in
    \(\begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix}\)
    the elements a12 & a21 are conjugate.
    (ii) The elements a11, a22, a33, …… ann are called Diagonal Elements. The line along which the diagonal elements lie is called the “Principal or Leading” diagonal. The quantity Σ aii is called the trace of the matrix, written tr A. A square matrix whose elements above (or below) the leading diagonal are all zero is a Triangular Matrix; a square matrix whose elements are all zero except those on the leading diagonal is a Diagonal Matrix, denoted diag (d1, d2, ….., dn); a diagonal matrix whose diagonal elements are all 1 is the Unit or Identity Matrix.
    Note: The minimum number of zeros in a diagonal matrix of order n is n(n – 1). It is to be noted that with every square matrix there is a corresponding determinant formed by the elements of A in the same order. (A short numerical sketch of these special types follows this list.)
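As a quick illustration (a minimal sketch in Python with NumPy; the notes themselves are language-agnostic, so the library choice is ours), the special types above can be constructed and the trace and zero-count facts checked directly:

```python
import numpy as np

# Null matrices of the orders shown above
A = np.zeros((3, 2))          # 3 x 2 null matrix
B = np.zeros((3, 3))          # 3 x 3 null matrix

# Diagonal matrix diag(4, 5, 6) and the unit (identity) matrix
D = np.diag([4, 5, 6])
I = np.eye(3)

# Trace = sum of the diagonal elements
print(np.trace(D))                            # 15
# A diagonal matrix of order n has at least n(n - 1) zeros
n = D.shape[0]
print(np.count_nonzero(D == 0), n * (n - 1))  # 6 6
```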

3. Equality Of Matrices:
Let A = [aij] & B = [bij]. Then A and B are equal if,

  • both have the same order.
  • aij = bij for each pair of i & j.

4. Algebra Of Matrices:
Addition:
 A + B = [aij + bij], where A & B are of the same order.

  • Addition of matrices is commutative, i.e. A + B = B + A, where A and B are both m × n.
  • Matrix addition is associative: (A + B) + C = A + (B + C). Note: A, B & C are of the same order.
  • Additive inverse: If A + B = O = B + A, then B is called the additive inverse of A, written B = −A (both m × n). (These properties are checked numerically below.)
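A minimal NumPy check of the three addition properties (the entries are arbitrary illustrative values):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])
C = np.array([[9, 0], [1, 2]])

print(np.array_equal(A + B, B + A))                # True: commutative
print(np.array_equal((A + B) + C, A + (B + C)))    # True: associative
print(np.array_equal(A + (-A), np.zeros((2, 2))))  # True: -A is the additive inverse
```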

5. Multiplication Of A Matrix By A Scalar:
\(\text{If } A=\begin{bmatrix} a & b & c \\ b & c & a \\ c & a & b \end{bmatrix}, \text{ then } kA=\begin{bmatrix} ka & kb & kc \\ kb & kc & ka \\ kc & ka & kb \end{bmatrix}\)
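In code this is plain element-wise scaling (the values of a, b, c, k are illustrative):

```python
import numpy as np

a, b, c, k = 1, 2, 3, 5
A = np.array([[a, b, c],
              [b, c, a],
              [c, a, b]])
print(k * A)   # every entry of A is multiplied by k, matching kA above
```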

6. Multiplication Of Matrices: (Row by Column) AB exists if A is m × n & B is n × p; e.g. if A is 2 × 3 and B is 3 × 3, then
AB exists, but BA does not ⇒ AB ≠ BA
Note: In the product AB,
\(\begin{cases} A = \text{pre-factor} \\ B = \text{post-factor} \end{cases}\)
If A = (a1, a2, …… an) is a 1 × n row matrix &
\(B=\begin{bmatrix} b_{1} \\ b_{2} \\ \vdots \\ b_{n} \end{bmatrix}\)
is an n × 1 column matrix, then AB = [a1b1 + a2b2 + …… + anbn]
If A = [aij] is an m × n matrix & B = [bij] is an n × p matrix, then
\((AB)_{ij}=\sum_{r=1}^{n} a_{ir}\, b_{rj}\)
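The formula translates directly into a double loop over i and j with an inner sum over r; a minimal NumPy sketch (illustrative entries), compared against the built-in product:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])      # 2 x 3
B = np.array([[7, 8, 9],
              [1, 0, 1],
              [2, 3, 4]])      # 3 x 3

m, n = A.shape
p = B.shape[1]
AB = np.zeros((m, p), dtype=int)
for i in range(m):
    for j in range(p):
        # (AB)_ij = sum over r of a_ir * b_rj (row i of A times column j of B)
        AB[i, j] = sum(A[i, r] * B[r, j] for r in range(n))

print(np.array_equal(AB, A @ B))  # True
# B @ A would raise a ValueError here: BA does not exist for these orders
```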

Properties Of Matrix Multiplication:

  • Matrix multiplication is not commutative.
    \(A=\begin{bmatrix} 1 & 1 \\ 0 & 0 \end{bmatrix};\ B=\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix};\ AB=\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix};\ BA=\begin{bmatrix} 1 & 1 \\ 0 & 0 \end{bmatrix} \Rightarrow AB \neq BA \text{ (in general)}\)
  • \(AB=\begin{bmatrix} 1 & 1 \\ 2 & 2 \end{bmatrix} \begin{bmatrix} -1 & 1 \\ 1 & -1 \end{bmatrix}=\begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}\)
    ⇒ AB = O ⇏ A = O or B = O
    Note: If A and B are two non-zero matrices such that AB = O, then A and B are called divisors of zero. Also, if AB = O then |AB| = |A||B| = 0 ⇒ |A| = 0 or |B| = 0, but not the converse. If A and B are two matrices such that
    (i) AB = BA ⇒ A and B commute with each other
    (ii) AB = – BA ⇒ A and B anticommute with each other

  • Matrix multiplication is associative: If A, B & C are conformable for the products AB & BC, then (A · B) · C = A · (B · C).
  • Distributivity: A(B + C) = AB + AC and (A + B)C = AC + BC, provided A, B & C are conformable for the respective products. (All four properties are checked numerically below.)
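A short NumPy check of these multiplication properties, reusing the matrices from the bullets above (C is an arbitrary illustrative choice):

```python
import numpy as np

# Non-commutativity, with the matrices from the first bullet
A = np.array([[1, 1], [0, 0]])
B = np.array([[1, 0], [0, 0]])
print(A @ B)   # [[1 0], [0 0]]
print(B @ A)   # [[1 1], [0 0]]  -> AB != BA

# Divisors of zero: AB = O although A != O and B != O
A = np.array([[1, 1], [2, 2]])
B = np.array([[-1, 1], [1, -1]])
print(A @ B)                               # [[0 0], [0 0]]
print(np.linalg.det(A), np.linalg.det(B))  # both zero, consistent with |A||B| = 0

# Associativity and distributivity
C = np.array([[3, 1], [4, 1]])
print(np.array_equal((A @ B) @ C, A @ (B @ C)))    # True
print(np.array_equal(A @ (B + C), A @ B + A @ C))  # True
```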

7. Positive Integral Powers Of A Square Matrix:
For a square matrix A, A²A = (AA)A = A(AA) = A³.
Note that for a unit matrix I of any order, Iᵐ = I for all m ∈ N.
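A two-line check (the matrix is illustrative):

```python
import numpy as np

A = np.array([[1, 1], [0, 1]])
print(np.array_equal(A @ A @ A, np.linalg.matrix_power(A, 3)))  # True: A^2 A = A^3
I = np.eye(2)
print(np.array_equal(np.linalg.matrix_power(I, 7), I))          # True: I^m = I
```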

8. Matrix Polynomial:
If f(x) = a₀xⁿ + a₁xⁿ⁻¹ + a₂xⁿ⁻² + ……… + aₙx⁰, then we define the matrix polynomial f(A) = a₀Aⁿ + a₁Aⁿ⁻¹ + a₂Aⁿ⁻² + ….. + aₙIₙ, where A is the given square matrix. If f(A) is the null matrix, then A is called a zero or root of the polynomial f(x).
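A minimal sketch of evaluating f(A) by Horner's rule (the helper name matrix_poly and the example values are ours; the example polynomial is the characteristic polynomial of A, so by the Cayley–Hamilton theorem A is a root):

```python
import numpy as np

def matrix_poly(coeffs, A):
    """Evaluate f(A) = a0*A^n + a1*A^(n-1) + ... + an*I via Horner's rule."""
    n = A.shape[0]
    result = np.zeros((n, n))
    for a in coeffs:               # coeffs = [a0, a1, ..., an]
        result = result @ A + a * np.eye(n)
    return result

A = np.array([[1.0, 2], [3, 4]])
# f(x) = x^2 - 5x - 2 = x^2 - (tr A) x + |A|, the characteristic polynomial of A
print(matrix_poly([1, -5, -2], A))   # [[0. 0.], [0. 0.]] -> A is a root of f
```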
Definitions:

  • Idempotent Matrix: A square matrix A is idempotent provided A² = A.
    Note that Aⁿ = A ∀ n ≥ 2, n ∈ N.
  • Nilpotent Matrix: A square matrix A is said to be a nilpotent matrix of order m, m ∈ N, if Aᵐ = O but Aᵐ⁻¹ ≠ O.
  • Periodic Matrix: A square matrix which satisfies the relation Aᵏ⁺¹ = A, for some positive integer k, is a periodic matrix. The period of the matrix is the least value of k for which this holds true.
    Note that the period of an idempotent matrix is 1.
  • Involutory Matrix: If A² = I, the matrix is said to be an involutory matrix.
    Note that A = A⁻¹ for an involutory matrix. (Small instances of these definitions are checked below.)
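Tiny standard examples of three of these classes, checked numerically (the particular matrices are illustrative choices, not from the notes):

```python
import numpy as np

P = np.array([[1, 0], [0, 0]])   # idempotent: P^2 = P
print(np.array_equal(P @ P, P))           # True

N = np.array([[0, 1], [0, 0]])   # nilpotent of order 2: N^2 = O but N != O
print((N @ N == 0).all(), N.any())        # True True

V = np.array([[0, 1], [1, 0]])   # involutory: V^2 = I, so V = V^-1
print(np.array_equal(V @ V, np.eye(2)))   # True
```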

9. The Transpose Of A Matrix: (Changing rows & columns)
Let A = [aij] be any matrix of order m × n.
⇒ Aᵀ or A′ = [aji] for 1 ≤ i ≤ n & 1 ≤ j ≤ m, of order n × m
Properties of Transpose of a Matrix:
If Aᵀ & Bᵀ denote the transposes of A and B,

  • (A ± B)ᵀ = Aᵀ ± Bᵀ; note that A & B have the same order.
  • (AB)ᵀ = BᵀAᵀ, where A & B are conformable for the matrix product AB.
  • (Aᵀ)ᵀ = A
  • (kA)ᵀ = kAᵀ, where k is a scalar.
  • General: (A₁A₂ …… Aₙ)ᵀ = Aₙᵀ …… A₂ᵀA₁ᵀ (reversal law for transpose; checked numerically below)
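A quick NumPy verification of the reversal law and two of the other properties (illustrative matrices):

```python
import numpy as np

A = np.array([[1, 2, 3], [4, 5, 6]])     # 2 x 3
B = np.array([[1, 0], [0, 1], [2, 2]])   # 3 x 2

print(np.array_equal((A @ B).T, B.T @ A.T))  # True: (AB)^T = B^T A^T
print(np.array_equal(A.T.T, A))              # True: (A^T)^T = A
print(np.array_equal((3 * A).T, 3 * A.T))    # True: (kA)^T = k A^T
```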

10. Symmetric & Skew-Symmetric Matrix:
A square matrix A = [aij] is said to be symmetric if aij = aji ∀ i & j (conjugate elements are equal).
(Note A = Aᵀ)
Note: The maximum number of distinct entries in a symmetric matrix of order n is \(\frac{n(n+1)}{2}\).
A is skew-symmetric if aij = − aji ∀ i & j (the pairs of conjugate elements are additive inverses of each other). (Note A = −Aᵀ.) Hence if A is skew-symmetric, then aii = − aii ⇒ aii = 0 ∀ i. Thus the diagonal elements of a skew-symmetric matrix are all zero, but not the converse.
Properties Of Symmetric & Skew-Symmetric Matrices:

  • Property 1: A is symmetric if Aᵀ = A;
    A is skew-symmetric if Aᵀ = − A
  • Property 2: A + Aᵀ is a symmetric matrix and A − Aᵀ is a skew-symmetric matrix. Consider (A + Aᵀ)ᵀ = Aᵀ + (Aᵀ)ᵀ = Aᵀ + A = A + Aᵀ, so
    A + Aᵀ is symmetric.
    Similarly, we can prove that A − Aᵀ is skew-symmetric.
  • Property 3: The sum of two symmetric matrices is a symmetric matrix, and the sum of two skew-symmetric matrices is a skew-symmetric matrix. Let Aᵀ = A & Bᵀ = B, where A & B have the same order; then (A + B)ᵀ = Aᵀ + Bᵀ = A + B. Similarly we can prove the other.
  • Property 4: If A & B are symmetric matrices, then
    (a) AB + BA is a symmetric matrix
    (b) AB − BA is a skew-symmetric matrix.
  • Property 5: Every square matrix can be uniquely expressed as the sum of a symmetric and a skew-symmetric matrix (checked numerically after this list):
    \(A=\underbrace{\tfrac{1}{2}\left(A+A^{T}\right)}_{P\ \text{(symmetric)}}+\underbrace{\tfrac{1}{2}\left(A-A^{T}\right)}_{Q\ \text{(skew-symmetric)}}\)
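The decomposition of Property 5, computed for an arbitrary illustrative matrix:

```python
import numpy as np

A = np.array([[1.0, 7, 3],
              [2, 4, 5],
              [0, 6, 8]])

P = (A + A.T) / 2    # symmetric part
Q = (A - A.T) / 2    # skew-symmetric part

print(np.array_equal(P, P.T))    # True: P is symmetric
print(np.array_equal(Q, -Q.T))   # True: Q is skew-symmetric
print(np.array_equal(P + Q, A))  # True: A = P + Q
```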

11. Adjoint Of A Square Matrix:
Let
\(A=\left[a_{ij}\right]=\begin{pmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{pmatrix}\)
be a square matrix, and let the matrix formed by the cofactors of [aij] in the determinant |A| be
\(C=\begin{pmatrix} C_{11} & C_{12} & C_{13} \\ C_{21} & C_{22} & C_{23} \\ C_{31} & C_{32} & C_{33} \end{pmatrix}\)
Then,
\((\text{adj } A)=\begin{pmatrix} C_{11} & C_{21} & C_{31} \\ C_{12} & C_{22} & C_{32} \\ C_{13} & C_{23} & C_{33} \end{pmatrix}\)
Theorem: If A is a square matrix of order n, then A (adj A) = (adj A) A = |A| Iₙ.
Note: If A and B are non-singular square matrices of the same order, then
(i) |adj A| = |A|ⁿ⁻¹
(ii) adj (AB) = (adj B) (adj A)
(iii) adj (kA) = kⁿ⁻¹ (adj A), where k is a scalar
(The adjoint is computed from cofactors, and (i) and the theorem are verified, in the sketch below.)
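A cofactor-based computation of adj A (the helper adjoint is ours; the test matrix is an arbitrary non-singular example):

```python
import numpy as np

def adjoint(A):
    """Transpose of the cofactor matrix: C_ij = (-1)^(i+j) * (minor of a_ij)."""
    n = A.shape[0]
    C = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T

A = np.array([[1.0, 2, 3],
              [0, 1, 4],
              [5, 6, 0]])
adjA = adjoint(A)
print(np.allclose(A @ adjA, np.linalg.det(A) * np.eye(3)))     # True: A (adj A) = |A| I_n
print(np.isclose(np.linalg.det(adjA), np.linalg.det(A) ** 2))  # True: |adj A| = |A|^(n-1), n = 3
```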
Inverse Of A Matrix (Reciprocal Matrix): A square matrix A is said to be invertible (non-singular) if there exists a matrix B such that AB = I = BA.
B is called the inverse (reciprocal) of A and is denoted by A⁻¹. Thus
A⁻¹ = B ⇔ AB = I = BA. We have A · (adj A) = |A| Iₙ; pre-multiplying both sides by A⁻¹,
A⁻¹ A (adj A) = A⁻¹ |A| Iₙ
⇒ Iₙ (adj A) = |A| A⁻¹
∴ \(A^{-1}=\frac{\text{adj } A}{|A|}\)
Note: The necessary and sufficient condition for a square matrix A to be invertible is that |A| ≠ 0.
Theorem: If A & B are invertible matrices of the same order, then (AB)⁻¹ = B⁻¹A⁻¹. (This is the reversal law for inverses.)
Note: (i) If A is an invertible matrix, then Aᵀ is also invertible & (Aᵀ)⁻¹ = (A⁻¹)ᵀ.
(ii) If A is invertible, (a) (A⁻¹)⁻¹ = A; (b) (Aᵏ)⁻¹ = (A⁻¹)ᵏ = A⁻ᵏ, k ∈ N
(iii) If A is an Orthogonal Matrix, AAᵀ = I = AᵀA.
(iv) A square matrix is said to be orthogonal if A⁻¹ = Aᵀ.
(v) \(|A^{-1}|=\frac{1}{|A|}\)
(Several of these facts are verified numerically below.)
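A short NumPy verification of the reversal law and notes (i) and (v), with arbitrary invertible matrices:

```python
import numpy as np

A = np.array([[1.0, 2], [3, 4]])
B = np.array([[2.0, 0], [1, 2]])

# Reversal law: (AB)^-1 = B^-1 A^-1
print(np.allclose(np.linalg.inv(A @ B), np.linalg.inv(B) @ np.linalg.inv(A)))  # True
# (A^T)^-1 = (A^-1)^T
print(np.allclose(np.linalg.inv(A.T), np.linalg.inv(A).T))                     # True
# |A^-1| = 1 / |A|
print(np.isclose(np.linalg.det(np.linalg.inv(A)), 1 / np.linalg.det(A)))       # True
```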

System Of Equations & Criterion For Consistency
Gauss–Jordan Method
x + y + z = 6, x − y + z = 2, 2x + y − z = 1
or
\(\begin{pmatrix} x+y+z \\ x-y+z \\ 2x+y-z \end{pmatrix}=\begin{pmatrix} 6 \\ 2 \\ 1 \end{pmatrix} \quad\text{i.e.}\quad \begin{pmatrix} 1 & 1 & 1 \\ 1 & -1 & 1 \\ 2 & 1 & -1 \end{pmatrix} \begin{pmatrix} x \\ y \\ z \end{pmatrix}=\begin{pmatrix} 6 \\ 2 \\ 1 \end{pmatrix}\)
AX = B ⇒ A⁻¹AX = A⁻¹B ⇒
\(X=A^{-1} B=\frac{(\text{adj } A)\, B}{|A|}\)
Note:
(1) If |A| ≠ 0, the system is consistent and has a unique solution.
(2) If |A| ≠ 0 & (adj A) · B ≠ O (null matrix), the system is consistent and has a unique non-trivial solution.
(3) If |A| ≠ 0 & (adj A) · B = O (null matrix), the system is consistent and has only the trivial solution X = O.
(4) If |A| = 0, the matrix method fails: if (adj A) · B = O (null matrix), the system has infinitely many solutions or no solution, while if (adj A) · B ≠ O, the system is inconsistent (no solution). (The worked system above is solved numerically below.)
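Solving the system above with X = A⁻¹B (NumPy used for illustration):

```python
import numpy as np

A = np.array([[1.0, 1, 1],
              [1, -1, 1],
              [2, 1, -1]])
B = np.array([6.0, 2, 1])

print(np.linalg.det(A))       # ~6.0, nonzero -> unique solution (case (1))
print(np.linalg.inv(A) @ B)   # [1. 2. 3.] -> x = 1, y = 2, z = 3
```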