Linear Algebra
Linear algebra is a branch of mathematics with extensive real-world applications. Unlike highly theoretical branches such as real analysis, which focus mostly on proofs and involve little computation, linear algebra strikes a wonderful balance between theory and computation [16]. Moreover, the connections between the two are clearly visible: the theory develops tools for making computations that can then be used to solve real-world problems. In fact, it is because of these characteristics that linear algebra is one of my favorite undergraduate math courses.
The study of linear algebra is centered on the idea of solving systems of linear equations [16]. While modern linear algebra developed over the past few centuries, there is evidence that the Chinese were working with its concepts far earlier, around 200 BC [2]. There are two main theories on how linear algebra developed. One holds that the need for “the development of a coherent, comprehensive characterization of systems of equations and their solutions” was the main driving factor [2]. The other holds that “the development of a formal, axiomatic way of algebraically defining relations among and operations on vectors” was the main factor that led to what linear algebra is today [2]. In any case, linear algebra has shaped the world with its variety of applications.
Linear algebra has applications in many branches of mathematics and in other fields including biology, business, calculus, chemistry, cryptography, ecology, economics, engineering, geometry, mathematical modeling, physical sciences, statistics, social and behavioral sciences, and even psychology [16]. Because of the techniques linear algebra provides for solving systems of linear equations, polynomials can be found that fit a collection of data points [16]. Solving these systems of equations also applies to network analysis, which is used in economics, traffic analysis, and electrical engineering [16]. Matrices, along with the operations that can be performed on them, can create models that predict consumer preferences [16]. They can also be used in cryptography to encode and decode messages and in least squares regression analysis in statistics [16]. Determinants have many geometrical applications, such as finding areas, volumes, and equations of lines and planes [16]. Vector spaces and eigenvalues can be applied to linear differential equations in calculus [16]. Eigenvalues can also be used to model population growth [16]. There are many more applications, but hopefully this gives a good idea of just how many areas linear algebra has touched.
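To make the first application above concrete, here is a minimal sketch of fitting a polynomial to data points by solving a linear system. The data points and the use of NumPy are illustrative assumptions, not taken from the text: a quadratic through three points leads to a $3 \times 3$ Vandermonde system in the coefficients.

```python
import numpy as np

# Three illustrative data points the polynomial should pass through.
xs = np.array([1.0, 2.0, 3.0])
ys = np.array([2.0, 3.0, 6.0])

# A quadratic p(x) = a0 + a1*x + a2*x^2 through three points gives a
# 3x3 linear system A @ [a0, a1, a2] = ys, where the rows of A are
# [1, x, x^2] for each data point (a Vandermonde matrix).
A = np.vander(xs, 3, increasing=True)
coeffs = np.linalg.solve(A, ys)

print(coeffs)  # → [ 3. -2.  1.], i.e. p(x) = 3 - 2x + x^2
```

One can check directly that $p(1) = 2$, $p(2) = 3$, and $p(3) = 6$, so the recovered quadratic does pass through all three points.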
There are many different areas of linear algebra, and each one comes with its own complexities and applications. We are going to begin with the fairly simple topic of solving systems of linear equations with matrices. While this is a simple concept, it is the foundation of how computers can be programmed to solve systems of hundreds of linear equations. We will then take a brief look at determinants, seeing how they can be calculated and introducing some of their many applications. Next, we will give a general description of vectors and look at vector spaces. We will also demonstrate how to show that a set of vectors forms a vector space and introduce the idea of subspaces. We will then look at a special subspace, known as the span of a subset $S$ of a vector space $V$. Then, we will see how a spanning set of a vector space, combined with the idea of linear independence, gives us a basis for the vector space. We will look at row and column spaces of matrices and see how to find bases for these spaces. We will also look at null spaces and how to find their bases as well. We will then conclude our discussion of linear algebra by looking at eigenvalues and a direct result of eigenvalues known as diagonalization. So, let’s begin our journey into the world of linear algebra.