Basis of a Vector Space in Linear Algebra: Definition and Key Theorems
Basis of a Vector Space
Finite dimensional vector space theory, along with the definition of the basis of a vector space, originated in the early development of linear algebra. The Replacement Theorem gives a systematic way to exchange a vector of a basis for another vector without destroying the basis property, while the Deletion Theorem extracts a basis from a spanning set by removing redundant vectors. These tools are widely applied in physics, engineering, and data science.
What You Will Learn
- Definition: Finite Dimensional or Finitely Generated Vector Space
- Definition: Infinite Dimensional Vector Space
- Definition: Basis of a Vector Space
- Theorem-1: There exists a basis for any finitely generated vector space (V,+,⋅) over a field (F,+,⋅).
- Replacement Theorem: Let (V,+,⋅) be a vector space over a field (F,+,⋅), let {α1,α2,…,αn} be a basis of V, and let β = ∑_{i=1}^{n} ci⋅αi be a non-zero vector of V, where c1,c2,…,cn ∈ F. If cj ≠ 0, then {α1,α2,…,αj−1,β,αj+1,…,αn} is also a basis of V.
- Theorem-3: Let (V,+,⋅) be a finite dimensional vector space over a field (F,+,⋅). If {α1,α2,…,αn} is a basis of V, then any linearly independent set in V contains at most n vectors.
- Theorem-4: If (V,+,⋅) is a finite dimensional vector space over a field (F,+,⋅), then any two bases of V have the same number of vectors.
Things to Remember
- Mapping
- Fields
- Vector Space
- Subspace
Introduction
Finite dimensional vector spaces, the basis of a vector space, and the Replacement Theorem are fundamental concepts in Mathematics, particularly in Linear Algebra. A vector space is fully characterized by a basis: a minimal set of linearly independent vectors that spans the whole space. These concepts play a crucial role in understanding subspaces and solving problems in higher-dimensional spaces.
Finite Dimensional or Finitely Generated Vector Space
Definition:
Let (V,+,⋅) be a vector space over a field (F,+,⋅). V is said to be a finite dimensional or finitely generated vector space if there exists a finite set of vectors in V that generates V.
Infinite Dimensional Vector Space
Definition:
Let (V,+,⋅) be a vector space over a field (F,+,⋅). V is said to be an infinite dimensional vector space if it is not a finitely generated vector space.
Basis of a Vector Space
Definition:
Let (V,+,⋅) be a vector space over a field (F,+,⋅) and S⊆V. S is said to be a basis of V if
- S is linearly independent in V
- L(S)=V.
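For a concrete check in the vector space ℝ³ over ℝ, both conditions of the definition can be tested at once: a set of n vectors in an n-dimensional space is linearly independent and spanning exactly when its coordinate matrix has full rank. A minimal numerical sketch using NumPy (the example vectors are illustrative choices, not taken from the text):

```python
import numpy as np

# Candidate basis for R^3; each row is one vector of S (hypothetical values)
S = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 0, 1]], dtype=float)

n = S.shape[1]                      # dimension of the ambient space R^n
rank = np.linalg.matrix_rank(S)

# With exactly n vectors, rank n <=> linearly independent <=> L(S) = R^n
is_basis = (S.shape[0] == n) and (rank == n)
print(is_basis)  # True for this choice of S
```

The same rank test works for any candidate set: fewer than n independent vectors fail to span, and more than n vectors cannot be independent (Theorem-3).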
Theorem-1
Statement:
There exists a basis for any finitely generated vector space (V,+,⋅) over a field (F,+,⋅).
Proof:
Given that (V,+,⋅) is a finitely generated vector space over a field (F,+,⋅).
- Case-1: Let V be a null vector space.
We have V = {θ}, where θ is the zero vector.
Since L(∅) = {θ}, the empty set ∅ is a basis of V.
- Case-2: Let V be a non-null vector space.
Then V ≠ {θ}, and since V is finitely generated, there exists a finite set of vectors that generates V.
Let S = {α1,α2,…,αn} be such that L(S) = V.
- Sub-case-1: S is linearly independent.
Since S is linearly independent and L(S) = V, S is a basis of V.
- Sub-case-2: S is linearly dependent.
Since S is linearly dependent and L(S) = V, by the Deletion Theorem there exists a proper subset T1 of S such that L(T1) = V.
If T1 is linearly independent, then T1 is a basis of V.
If T1 is linearly dependent and L(T1) = V, then again by the Deletion Theorem there exists a proper subset T2 of T1 such that L(T2) = V.
Continuing this process, after a finite number of steps we obtain a linearly independent set Ti ⊂ Ti−1 such that L(Ti) = V. Hence Ti is a basis of V.
Hence there exists a basis for any finitely generated vector space (V,+,⋅) over a field (F,+,⋅).
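The deletion process in Sub-case-2 can be imitated numerically: scan the spanning set and keep each vector only if it is not already in the span of the vectors kept so far. A sketch assuming NumPy, with rank comparison as the independence test (the example vectors are hypothetical):

```python
import numpy as np

def extract_basis(spanning):
    """Greedy deletion: keep a vector only if it enlarges the span,
    i.e. if the kept vectors stay linearly independent."""
    kept = []
    for v in spanning:
        candidate = kept + [v]
        # full rank <=> candidate set is linearly independent
        if np.linalg.matrix_rank(np.array(candidate)) == len(candidate):
            kept.append(v)
    return kept

# A linearly dependent spanning set for R^2 (hypothetical)
S = [np.array([1.0, 0.0]),
     np.array([2.0, 0.0]),   # redundant: a multiple of the first vector
     np.array([0.0, 1.0])]

basis = extract_basis(S)
print(len(basis))  # 2: the redundant vector was deleted
```

As in the proof, the process terminates after finitely many steps with an independent set that still spans the same space.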
Replacement Theorem
Statement:
Let (V,+,⋅) be a vector space over a field (F,+,⋅), let {α1,α2,…,αn} be a basis of V, and let β = ∑_{i=1}^{n} ci⋅αi be a non-zero vector of V, where c1,c2,…,cn ∈ F. If cj ≠ 0, then {α1,α2,…,αj−1,β,αj+1,…,αn} is also a basis of V.
Proof:
Given that (V,+,⋅) is a vector space over a field (F,+,⋅), that S = {α1,α2,…,αn} (say) is a basis of V, and that

$$\beta=\sum_{i=1}^{n} c_{i}\cdot\alpha_{i} \tag{1}$$

is a non-zero vector of V, where c1,c2,…,cn ∈ F.
Let cj ≠ 0. Then cj⁻¹ exists in F. We have

$$\begin{align*}
&\beta=\sum_{i=1}^{j-1} c_{i}\cdot\alpha_{i}+c_{j}\cdot\alpha_{j}+\sum_{i=j+1}^{n} c_{i}\cdot\alpha_{i}\\
\implies &c_{j}\cdot\alpha_{j}=\sum_{i=1}^{j-1} \left(-c_{i}\right)\cdot\alpha_{i}+\beta+\sum_{i=j+1}^{n} \left(-c_{i}\right)\cdot\alpha_{i}\\
\implies &\alpha_{j}=\sum_{i=1}^{j-1} \left(-c_{j}^{-1}c_{i}\right)\cdot\alpha_{i}+c_{j}^{-1}\cdot\beta+\sum_{i=j+1}^{n} \left(-c_{j}^{-1}c_{i}\right)\cdot\alpha_{i} \tag{2}
\end{align*}$$

Write T = {α1,…,αj−1,β,αj+1,…,αn}. To prove that T is a basis of V, we show that T is linearly independent and that L(T) = V.
- To prove: T is linearly independent.
Let

$$\sum_{i=1}^{j-1} d_{i}\cdot\alpha_{i}+d_{j}\cdot\beta+\sum_{i=j+1}^{n} d_{i}\cdot\alpha_{i}=\theta$$

for some scalars d1,d2,…,dn ∈ F. Now

$$\begin{align*}
&\sum_{i=1}^{j-1} d_{i}\cdot\alpha_{i}+d_{j}\cdot\left[\sum_{i=1}^{j-1} c_{i}\cdot\alpha_{i}+c_{j}\cdot\alpha_{j}+\sum_{i=j+1}^{n} c_{i}\cdot\alpha_{i}\right]+\sum_{i=j+1}^{n} d_{i}\cdot\alpha_{i}=\theta \quad\text{using (1)}\\
\implies &\sum_{i=1}^{j-1} \left(d_{i}+d_{j}c_{i}\right)\cdot\alpha_{i}+\left(d_{j}c_{j}\right)\cdot\alpha_{j}+\sum_{i=j+1}^{n} \left(d_{i}+d_{j}c_{i}\right)\cdot\alpha_{i}=\theta
\end{align*}$$

Since S is linearly independent, every coefficient vanishes. So dj·cj = 0, and since cj ≠ 0, we get dj = 0. Then di + dj·ci = 0 gives di = 0 for i = 1,2,…,j−1,j+1,…,n.
Hence di = 0 for all i = 1,2,…,n, so T is linearly independent.
- To prove: L(T) = V.
Since T ⊆ V, we have L(T) ⊆ V.
Conversely, let γ ∈ V. Since S is a basis, γ = ∑_{i=1}^{n} fi⋅αi for some scalars f1,f2,…,fn ∈ F. Then

$$\begin{align*}
\gamma&=\sum_{i=1}^{j-1} f_{i}\cdot\alpha_{i}+f_{j}\cdot\left[\sum_{i=1}^{j-1} \left(-c_{j}^{-1}c_{i}\right)\cdot\alpha_{i}+c_{j}^{-1}\cdot\beta+\sum_{i=j+1}^{n} \left(-c_{j}^{-1}c_{i}\right)\cdot\alpha_{i}\right]+\sum_{i=j+1}^{n} f_{i}\cdot\alpha_{i} \quad\text{using (2)}\\
\implies \gamma&=\sum_{i=1}^{j-1} \left(f_{i}-f_{j}c_{j}^{-1}c_{i}\right)\cdot\alpha_{i}+\left(f_{j}c_{j}^{-1}\right)\cdot\beta+\sum_{i=j+1}^{n} \left(f_{i}-f_{j}c_{j}^{-1}c_{i}\right)\cdot\alpha_{i}\\
\implies \gamma&\in L(T)
\end{align*}$$

So V ⊆ L(T), and therefore L(T) = V.
Hence T = {α1,…,αj−1,β,αj+1,…,αn} is a basis of V.
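The replacement step can be illustrated numerically in ℝ³: starting from the standard basis, a vector β with coordinates c may replace αj exactly when cj ≠ 0. A sketch with hypothetical numbers:

```python
import numpy as np

# Standard basis of R^3 and a vector beta with coordinates c = (2, 3, 0)
alphas = [np.array([1.0, 0.0, 0.0]),
          np.array([0.0, 1.0, 0.0]),
          np.array([0.0, 0.0, 1.0])]
c = [2.0, 3.0, 0.0]
beta = sum(ci * ai for ci, ai in zip(c, alphas))

# c[1] != 0, so the theorem allows replacing alpha_2 (index j = 1) by beta
T = [alphas[0], beta, alphas[2]]
still_basis = np.linalg.matrix_rank(np.array(T)) == 3
print(still_basis)  # True: T is again a basis

# c[2] == 0: replacing alpha_3 by beta does NOT give a basis
bad = [alphas[0], alphas[1], beta]
print(np.linalg.matrix_rank(np.array(bad)))  # 2: the set is dependent
```

The failed replacement shows why the hypothesis cj ≠ 0 is essential: when cj = 0, β already lies in the span of the remaining α's.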
Theorem-3
Statement:
Let (V,+,⋅) be a finite dimensional vector space over a field (F,+,⋅). If {α1,α2,…,αn} is a basis of V, then any linearly independent set in V contains at most n vectors.
Proof:
Given that (V,+,⋅) is a finite dimensional vector space over a field (F,+,⋅) and that S = {α1,α2,…,αn} is a basis of V.
Let B = {β1,β2,…,βr} be any linearly independent set in V.
To prove: r ≤ n.
Since L(S) = V, there exist scalars a1,a2,…,an ∈ F such that β1 = ∑_{i=1}^{n} ai⋅αi.
If every ai = 0, then β1 = θ. A contradiction, since a linearly independent set cannot contain the zero vector θ.
Therefore our assumption is wrong. So some aj ≠ 0, and by the Replacement Theorem {α1,…,αj−1,β1,αj+1,…,αn} is a basis of V.
Next, express β2 in this new basis. Its coefficients on the remaining α's cannot all be zero, for otherwise β2 would be a scalar multiple of β1, contradicting the independence of B. So, by the Replacement Theorem again, some remaining α can be replaced by β2.
Continuing in this way, after a finite number of steps, the following cases may arise:
- If r < n:
In this case, after r steps, a new basis is formed containing all r of the β's together with the remaining (n−r) α's.
- If r = n:
In this case, after n steps, a new basis is formed in which all n of the α's have been replaced by the β's.
- If r > n:
In this case, after n steps, a new basis {β1,β2,…,βn} is formed in which all the α's have been replaced by β's, but the vectors {βn+1,βn+2,…,βr} are still not in the new basis.
Then there exist scalars e1,e2,…,en ∈ F such that

$$\beta_{n+1}=\sum_{i=1}^{n} e_{i}\cdot\beta_{i}$$

⟹ {β1,β2,…,βn,βn+1} is linearly dependent.
A contradiction, since {β1,β2,…,βr} is a linearly independent set and {β1,β2,…,βn,βn+1} ⊂ {β1,β2,…,βr}.
Therefore r ≯ n, i.e. r ≤ n.
Hence any linearly independent set in V contains at most n vectors.
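Theorem-3 admits a quick numerical sanity check: since ℝ³ has a basis of 3 vectors, any 4 vectors in ℝ³ must be linearly dependent, so their coordinate matrix has rank at most 3. A sketch with randomly generated vectors:

```python
import numpy as np

rng = np.random.default_rng(0)

# Four vectors in R^3: by Theorem-3 they cannot all be independent,
# because R^3 has a basis of only 3 vectors.
vecs = rng.standard_normal((4, 3))

rank = np.linalg.matrix_rank(vecs)
print(rank <= 3)  # True: at most 3 of the 4 vectors are independent
```

No matter how the four vectors are chosen, the rank bound holds; this is exactly the "r ≤ n" conclusion of the theorem.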
Theorem-4
Statement:
If (V,+,⋅) is a finite dimensional vector space over a field (F,+,⋅), then any two bases of V have the same number of vectors.
Proof:
Given that (V,+,⋅) is a finite dimensional vector space over a field (F,+,⋅).
Let S = {α1,α2,…,αm} and S′ = {β1,β2,…,βn} be two bases of V.
Now we have: S is a basis of V and S′ is a linearly independent set in V, so by Theorem-3, n ≤ m.
Similarly, S′ is a basis of V and S is a linearly independent set in V, so by Theorem-3, m ≤ n.
Therefore m = n.
Therefore any two bases of V have the same number of vectors.
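Theorem-4 is easy to observe concretely: two visibly different bases of ℝ² necessarily contain the same number of vectors. A small sketch with hypothetical basis choices:

```python
import numpy as np

# Two different bases of R^2 (hypothetical choices)
B1 = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]   # standard basis
B2 = [np.array([1.0, 1.0]), np.array([1.0, -1.0])]  # a rotated/scaled basis

# Verify each set really is a basis: 2 vectors of rank 2 in R^2
for B in (B1, B2):
    assert np.linalg.matrix_rank(np.array(B)) == 2

print(len(B1) == len(B2))  # True: both bases have exactly 2 vectors
```

The common size of all bases is what defines the dimension of the space.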
Applications
- Dimensionality Reduction
Identifies minimal sets of vectors in machine learning.
- Signal Analysis
Used in signal processing to decompose signals into independent components.
- Quantum Mechanics
Describes states in a vector space using orthonormal bases.
- System of Equations
Simplifies solving systems in linear algebra by utilizing bases.
Conclusion
The Basis of a vector space and the Replacement Theorem provide clarity in understanding the structure of Finite dimensional vector spaces. These concepts are foundational in Linear Algebra and have widespread applications across various disciplines.
Related Articles
- Mappings
- Binary Compositions
- Vector Space
- Linear Transformations
FAQs
- What is a finite dimensional vector space?
A vector space with a finite number of basis vectors.
- What is the basis of a vector space?
A set of linearly independent vectors that spans the entire vector space.
- What is the Replacement Theorem?
The theorem states that a basis vector can be replaced by another vector with a non-zero coefficient in its expansion, yielding a new basis.
- How is a basis constructed?
A basis is constructed by removing linearly dependent vectors from a spanning set.
- What is the significance of a basis?
A basis uniquely defines the structure of a vector space.
- How does the Replacement Theorem work?
It exchanges a basis vector for a suitable new vector without altering the spanned space.
- What are generating sets?
Generating sets span a subspace, but they may not be linearly independent.
- How is a basis used in applications?
Bases simplify problems in optimization and data analysis.
- Can there be multiple bases for a vector space?
Yes, but all bases of a finite dimensional vector space have the same size.
- What is the connection between linear independence and basis?
Every basis consists of linearly independent vectors.