MATH1030: Linear algebra I

Rigorous linear algebra notes on systems, matrices, structure, and proof, with interaction used only where it clarifies the mathematics.

Use the sidebar to move chapter by chapter, or jump directly into a section below.

9 chapters
Each section stays readable on the page and exports as a static study copy when you need an offline version.

Course contents

Chapter 1 · Systems of equations (1 section)
Chapter 4 · Solution structure (1 section)
Chapter 5 · Invertibility (1 section)

Chapter 1

Systems of equations

Learn to read equations as descriptions of full solution sets.

1.1 Equations and solution sets (embedded interaction)

Read a linear system as a collection of conditions and describe its full solution set carefully.

Chapter 2

Matrices and elimination

Build matrix intuition and use row reduction with purpose.

2.1 Matrix basics (embedded interaction)

Build matrix intuition before you row-reduce: size, entries, rows, columns, and arithmetic meaning.

2.2 Augmented matrices and row operations (embedded interaction)

Translate a system into an augmented matrix and understand what each row operation preserves.

2.3 Gaussian elimination and RREF (embedded interaction)

See Gaussian elimination as a sequence of purposeful moves, not just memorized mechanics.
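If you want to check a hand reduction mechanically, here is a minimal sketch using SymPy (an external tool, not part of these notes; the system shown is an invented example):

```python
from sympy import Matrix

# Augmented matrix for the system: x + 2y = 5, 3x + 4y = 6
A = Matrix([[1, 2, 5],
            [3, 4, 6]])

# Reduced row echelon form, plus the indices of the pivot columns
R, pivot_cols = A.rref()
```

Here the RREF has pivots in columns 0 and 1 and last column (-4, 9/2), so the unique solution is x = -4, y = 9/2, which you can confirm by substitution.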

2.4 Solution-set types (embedded interaction)

Classify whether a system has one solution, infinitely many solutions, or no solution by reading its reduced form.

Chapter 3

Matrix algebra

Matrix multiplication, transpose, and structural matrix notation.

3.1 Matrix multiplication and identity matrices (embedded interaction)

Learn when matrix products are defined, how the row-by-column rule works, and why the identity matrix matters for solving linear systems.
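The size rule and the role of the identity can be seen in a small NumPy sketch (an illustration with made-up matrices, not course material):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4],
              [5, 6]])          # 3 x 2
B = np.array([[1, 0, 2],
              [0, 1, 3]])       # 2 x 3

C = A @ B                       # defined because the inner sizes match (2 = 2); C is 3 x 3
# B @ A is also defined here (3 = 3), but it is 2 x 2: order matters

I = np.eye(3, dtype=int)        # identity: multiplying by I leaves C unchanged
```

Each entry of C is a row of A dotted with a column of B; for instance C[2, 2] = 5*2 + 6*3 = 28.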

3.2 Transpose and special matrices

Use transpose, symmetry, commuting products, and block notation to read matrix structure rather than treating formulas as isolated tricks.

Chapter 4

Solution structure

Homogeneous systems, null spaces, and the shape of full solution sets.

4.1 Homogeneous systems and null space

Study homogeneous systems carefully, then use null spaces to describe every solution as a structured set rather than a loose list of examples.
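The "structured set" point can be made concrete with SymPy (an external tool; the matrix below is an invented example):

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6]])     # rank 1, so solutions of Ax = 0 form a 2-dimensional null space

basis = A.nullspace()       # a basis for the null space
```

Every solution of Ax = 0 is a linear combination of these basis vectors, which is exactly the structured description this section builds.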

Chapter 5

Invertibility

Understand when a matrix can be undone and why that matters.

5.1 Invertible matrices (embedded interaction)

Connect inverse matrices, row reduction, and the practical meaning of nonsingularity.
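As a quick sketch of "undoing" a matrix (SymPy, with an invented 2 x 2 example; det = 1, so the matrix is nonsingular):

```python
from sympy import Matrix, eye

A = Matrix([[2, 1],
            [5, 3]])        # det(A) = 2*3 - 1*5 = 1, nonzero, so A is invertible

A_inv = A.inv()             # multiplying by A_inv recovers the identity
```

Because A * A_inv is the identity, A_inv reverses whatever A does to a vector.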

Chapter 6

Vector spaces

Move from matrix procedures to the structure of spaces, span, independence, and basis.

6.1 Vector spaces

Start from familiar examples and learn what the vector-space axioms are trying to protect.

6.2 Subspaces (embedded interaction)

Use the subspace test to separate genuine linear structure from lookalikes that fail closure or miss the zero vector.

6.3 Linear combinations and span (embedded interaction)

Treat linear combinations as controlled building instructions, then see span as every vector you can build that way.
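One standard membership test, sketched in SymPy (an illustration with invented vectors): a vector lies in the span of a matrix's columns exactly when appending it as a new column does not raise the rank.

```python
from sympy import Matrix

# the two columns span a plane in R^3
A = Matrix([[1, 0],
            [1, 1],
            [0, 1]])
v = Matrix([2, 3, 1])

# v is in the span iff the rank does not increase
in_span = A.row_join(v).rank() == A.rank()
```

Here v = 2*(first column) + 1*(second column), so the test reports membership; a vector outside the plane would push the rank up to 3.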

6.4 Linear dependence and independence (embedded interaction)

Read dependence as redundancy, and independence as the point where every coefficient truly matters.

6.5 Basis and dimension (embedded interaction)

See why a basis is the smallest complete coordinate system for a space, and why dimension counts how many directions are really needed.

6.6 Column space, row space, and rank

Use row reduction and basis ideas together to read column space, row space, and rank without confusing what row operations actually preserve.
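A small SymPy sketch of the same bookkeeping (invented matrix; not course material):

```python
from sympy import Matrix

A = Matrix([[1, 2, 0],
            [2, 4, 1],
            [3, 6, 1]])     # the second column is twice the first

# row operations preserve the row space but generally change the column space;
# the rank, the dimension shared by both spaces, survives either way
col_basis = A.columnspace()
row_basis = A.rowspace()
```

Both bases have two vectors, matching the rank, even though the two spaces themselves live on different sides of the matrix.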

Chapter 7

Determinants

Determinants, cofactor formulas, and the structural algebra that connects row operations, transpose, and invertibility.

7.1 Determinants and cofactor expansion

Define determinants carefully through minors and cofactors, then learn how cofactor expansion turns one scalar into a precise summary of square-matrix structure.
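Cofactor expansion along the first row translates directly into a short recursion; this sketch (plain Python, invented example, exponential-time and meant only for hand-sized matrices) mirrors the definition:

```python
def det_cofactor(M):
    """Determinant of a square matrix (list of rows) by cofactor expansion along row 0."""
    if len(M) == 1:
        return M[0][0]
    total = 0
    for j in range(len(M)):
        # minor: delete row 0 and column j
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        # cofactor sign (-1)^(0+j) times entry times minor determinant
        total += (-1) ** j * M[0][j] * det_cofactor(minor)
    return total
```

Each term is one entry of the first row weighted by its signed minor, exactly the sum the section defines.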

7.2 Row operations, products, and invertibility

Track exactly how row operations change determinants, then connect that behavior to multiplicativity, inverse matrices, and invertibility tests.

7.3 Transpose, column operations, and Cramer's rule

Use transpose and column operations to read determinants from a second angle, then finish the chapter with adjoints, inverse formulas, and Cramer's rule.

Chapter 8

Eigenvalues and diagonalization

Eigenvalues, eigenspaces, similarity, and diagonalization as the next structural layer after determinants.

8.1 Eigenvalues, eigenvectors, and eigenspaces

Define eigenvalues through the equation Av=λv, then recast the same idea as a null-space and determinant question so the structure becomes computable.
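The defining equation Av = λv can be verified numerically; a minimal NumPy sketch (an illustration with an invented symmetric matrix):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigenvalues, and eigenvectors as the columns of evecs
evals, evecs = np.linalg.eig(A)

for lam, v in zip(evals, evecs.T):
    # each pair satisfies the defining equation Av = lambda * v
    assert np.allclose(A @ v, lam * v)
```

For this matrix the eigenvalues are 1 and 3 (trace 4, determinant 3), which pins them down without any iteration.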

8.2

Treat diagonalization as a basis change built from eigenvectors, then use similarity to explain when a matrix can be simplified without changing its essential eigenvalue data.

8.3 Characteristic polynomials and diagonalization tests

Use characteristic polynomials, algebraic and geometric multiplicity, and the distinct-eigenvalue test to decide when eigenvalue data is enough for diagonalization.

Chapter 9

Inner products and orthogonality

Inner products, orthogonality, orthonormal bases, and Gram-Schmidt as the geometric layer after eigenvalues.

9.1 Inner products, norms, and angles

Define the standard inner product and norm on R^m, then connect those formulas to length, angle, and the first structural inequalities.

9.2 Orthogonal sets and orthonormal bases

Use orthogonality to build orthogonal and orthonormal bases, then read coefficients without solving a linear system every time.
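The "no system to solve" payoff in one NumPy sketch (invented orthonormal basis of R^2; an illustration, not course material):

```python
import numpy as np

# an orthonormal basis of R^2
q1 = np.array([1.0, 1.0]) / np.sqrt(2.0)
q2 = np.array([1.0, -1.0]) / np.sqrt(2.0)

v = np.array([3.0, 1.0])
# with an orthonormal basis, each coefficient is just an inner product
c1, c2 = v @ q1, v @ q2
```

Reassembling c1*q1 + c2*q2 recovers v exactly, with no linear system solved along the way.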

9.3 Gram-Schmidt orthogonalization

Apply Gram-Schmidt to turn a basis into an orthogonal or orthonormal basis while preserving the same span.
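The procedure translates almost line for line into code; a minimal NumPy sketch of classical Gram-Schmidt (invented input vectors, and a tolerance choice that is an assumption, not part of the notes):

```python
import numpy as np

def gram_schmidt(vectors, tol=1e-12):
    """Classical Gram-Schmidt: orthonormal vectors with the same span as the input."""
    basis = []
    for v in vectors:
        w = np.array(v, dtype=float)
        for q in basis:
            w = w - (w @ q) * q        # remove the component along each earlier direction
        norm = np.linalg.norm(w)
        if norm > tol:                 # a (near-)zero remainder means v was already in the span
            basis.append(w / norm)
    return basis

Q = gram_schmidt([[1.0, 1.0, 0.0],
                  [1.0, 0.0, 1.0]])
```

The output vectors are unit length and mutually orthogonal, and each is a combination of the inputs, so the span is unchanged.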

9.4 Cauchy-Schwarz and triangle inequalities

Study Cauchy-Schwarz and triangle inequalities as the two core estimates that control length, angle, and equality cases in inner-product spaces.
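Both estimates, and the Cauchy-Schwarz equality case for parallel vectors, can be spot-checked numerically (NumPy sketch with randomly generated vectors; purely an illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(4)
v = rng.standard_normal(4)

# Cauchy-Schwarz: |<u, v>| <= ||u|| ||v||
cs_ok = abs(u @ v) <= np.linalg.norm(u) * np.linalg.norm(v)
# Triangle inequality: ||u + v|| <= ||u|| + ||v||
tri_ok = np.linalg.norm(u + v) <= np.linalg.norm(u) + np.linalg.norm(v)
```

For parallel vectors such as u and 2u, Cauchy-Schwarz holds with equality, which is the equality case the section analyzes.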