🎓 Matrix Learning Lab

Interactive visualizations and explorations for linear algebra — from first principles to matrix decompositions

🎯 What Are Matrices?

Welcome to your interactive journey into linear algebra! Let's start with the absolute basics and build up from there.

Part 1: A Matrix is Just a Grid of Numbers

Think of it like a table or spreadsheet:

│  1   2  │
│  3   4  │

This is a 2×2 matrix (2 rows, 2 columns).
We read it as "2 by 2".

How to Read Matrix Notation

Matrix A = │  a   b  │
           │  c   d  │

Entry notation:
• A₁₁ = a (row 1, col 1)
• A₁₂ = b (row 1, col 2)
• A₂₁ = c (row 2, col 1)
• A₂₂ = d (row 2, col 2)

Here's the magic: Matrices transform vectors into new vectors. Imagine a matrix as a machine:

🎯 Input Vector → Matrix Machine → 🎯 Output Vector

The matrix "transforms" the input into something new!

๐Ÿ“ Simple Example:

Let's see how a matrix transforms a vector (point):

Matrix    │ 2  0 │    Input     │ 3 │
          │ 0  3 │  × Vector    │ 2 │

Result = (2×3 + 0×2, 0×3 + 3×2) = │ 6 │  ← Output Vector
                                  │ 6 │

What happened?
The point (3, 2) was transformed to (6, 6):
• x-coordinate: 2 × 3 + 0 × 2 = 6 (doubled)
• y-coordinate: 0 × 3 + 3 × 2 = 6 (tripled)
The matrix stretched the point along both axes!
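
If you'd like to check this on a computer, here is a minimal NumPy sketch of the same calculation (NumPy is assumed to be available; the variable names are just for illustration):

    import numpy as np

    A = np.array([[2, 0],
                  [0, 3]])   # the stretching matrix from the example
    v = np.array([3, 2])     # the input point (3, 2)

    print(A @ v)             # [6 6]  -> the output point (6, 6)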

Real-World Applications

🎮 Computer Graphics

Rotate, scale, and move 3D game objects. Every rotation in a video game uses matrices!

📱 Image Filters

Instagram filters? Matrices! They transform pixel colors to create effects like blur, sharpen, and edge detection.

🤖 Machine Learning

Neural networks are built from matrices. Every AI model you use (ChatGPT, image recognition) runs on matrix math!

📊 Data Analysis

Organize and analyze large datasets. Matrices help find patterns in millions of data points (like Netflix recommendations).

⚡ Engineering

Model electrical circuits, structural forces, and quantum mechanics. Engineers use matrices to predict how systems behave.

๐Ÿ” Cryptography

Encrypt secret messages! Some encryption methods multiply your message by a secret matrix to scramble it.

🎯 Lesson 2: Matrix Arithmetic

Learning Objectives:
✓ Master matrix addition and subtraction
✓ Understand matrix multiplication (it's different!)
✓ Learn transpose, determinant, and inverse operations

Part 1: Addition & Subtraction (The Easy Ones!)

Good news: Adding matrices is like adding regular numbers — just do it element-by-element!

Example: Add these 2×2 matrices
│ 1  2 │
│ 3  4 │
Matrix A
+
│ 5  6 │
│ 7  8 │
Matrix B
=
│  6   8 │
│ 10  12 │
Result

Each element adds: (1+5=6, 2+6=8, 3+7=10, 4+8=12)

⚠️ Important Rule: You can only add/subtract matrices with the SAME dimensions! A 2×3 matrix + 2×3 matrix = ✓ Valid. A 2×2 matrix + 3×3 matrix = ✗ Error!
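
As a quick sanity check, here is a hedged NumPy sketch of element-by-element addition and the same-dimensions rule:

    import numpy as np

    A = np.array([[1, 2], [3, 4]])
    B = np.array([[5, 6], [7, 8]])

    print(A + B)   # [[ 6  8]
                   #  [10 12]]  -- added element by element
    # Adding a 2x2 to a 3x3 (e.g. A + np.ones((3, 3))) raises a ValueError: shapes don't match.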

Part 2: Matrix Multiplication (The Tricky One!)

Warning: Matrix multiplication is NOT element-by-element! It's more like "mix and combine".

The Rule: To multiply matrices A × B:
  1. Take a row from A
  2. Take a column from B
  3. Multiply corresponding elements and add them up
  4. That's one entry in the result!

Example: Calculate entry (1,1) of A × B
Row 1 of A = [1, 2]
Column 1 of B = [5, 7]

Result = (1×5) + (2×7) = 5 + 14 = 19

💡 Dimension Rule: You can only multiply A × B if columns_of_A = rows_of_B. A (2×3) × B (3×4) = ✓ Valid (result is 2×4). A (2×3) × B (4×2) = ✗ Invalid!
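
The same example in a short NumPy sketch; note that @ is the matrix product, while * is element-wise (a common source of bugs):

    import numpy as np

    A = np.array([[1, 2], [3, 4]])
    B = np.array([[5, 6], [7, 8]])

    C = A @ B          # true matrix multiplication
    print(C[0, 0])     # 19 = (1*5) + (2*7): row 1 of A combined with column 1 of B
    print(A * B)       # element-wise product -- a different operation entirely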

Part 3: Transpose (The Flipper!)

The transpose of a matrix flips rows into columns (and columns into rows).

│ 1  2  3 │
│ 4  5  6 │
Original (2×3)
→ Aᵀ →
│ 1  4 │
│ 2  5 │
│ 3  6 │
Transposed (3×2)

Notice: Rows became columns! The first row [1,2,3] became the first column.
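
In NumPy the transpose is just the .T attribute; a minimal sketch:

    import numpy as np

    A = np.array([[1, 2, 3],
                  [4, 5, 6]])   # shape (2, 3)

    print(A.T)                  # [[1 4]
                                #  [2 5]
                                #  [3 6]]  -- shape (3, 2)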

Part 4: Determinant (The "Size" of a Matrix)

The determinant tells you if a matrix is "invertible" and measures scaling. For 2×2 matrices:

│  a   b  │
│  c   d  │
det(A) = ad - bc

Example:
Matrix: │  2   3  │
        │  1   4  │

det = (2×4) - (3×1) = 8 - 3 = 5

🔍 What does the determinant mean?
• det = 0 → Matrix is "singular" (not invertible, collapses space)
• det > 0 → Matrix preserves orientation
• det < 0 → Matrix flips orientation (mirrors)
• |det| = scaling factor (how much the matrix stretches area/volume)
Exercise 3: Determinant

Find det(A) for:

A = │ 3  -2 │
    │ 1   4 │

Solution:
det(A) = (3×4) - (-2×1) = 12 - (-2) = 12 + 2 = 14
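
For a numerical double-check of this exercise (a sketch using NumPy; expect a tiny floating-point rounding error):

    import numpy as np

    A = np.array([[3, -2],
                  [1,  4]])

    print(np.linalg.det(A))   # ~14.0  (matches 3*4 - (-2)*1 = 14)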

💻 Interactive Matrix Calculator

Now try these operations yourself! Load presets or enter your own matrices.


🎯 Lesson 3: Meet the Matrix Celebrities

Learning Objectives:
✓ Recognize identity, diagonal, rotation, and symmetric matrices
✓ Understand what makes each matrix type special
✓ Know when to use each type in practice

Just like numbers have special cases (0, 1, π), matrices have "celebrities" — special types that show up everywhere!

🟦 Identity Matrix (I) — The "Number 1" of Matrices

What is it? Ones on the diagonal, zeros everywhere else.

│ 1  0  0 │
│ 0  1  0 │
│ 0  0  1 │

Superpower: Multiplying by I does nothing!
A × I = A (just like 5 × 1 = 5)
Why it matters: Starting point for transformations, used in algorithms, essential for matrix inverses

🔄 Rotation Matrix — The Spinner

What is it? Rotates vectors by angle θ without stretching.

│ cos(θ)  -sin(θ) │
│ sin(θ)   cos(θ) │

Superpower: Pure rotation, preserves lengths!
Example: θ = 45° → cos(45°) ≈ 0.707, sin(45°) ≈ 0.707
Why it matters: Computer graphics, robotics, animations, GPS systems

๐Ÿ“ Diagonal Matrix โ€” The Scaler

What is it? Only diagonal entries are non-zero.

โ”‚ 2  0  0 โ”‚
โ”‚ 0  5  0 โ”‚
โ”‚ 0  0  3 โ”‚
Superpower: Scales each axis independently!
(x,y,z) โ†’ (2x, 5y, 3z)
Why it matters: Easy to compute powers (Dยฒ = just square each entry!), eigenvalue decompositions

๐Ÿ” Symmetric Matrix โ€” The Mirror

What is it? Equals its own transpose: A = Aแต€ (mirror across diagonal).

         
โ”‚ 4  1  2โ”‚
โ”‚ 1  3  5โ”‚  โ† Notice symmetry
โ”‚ 2  5  6โ”‚     across diagonal!
         
Superpower: Always has real eigenvalues!
Check: Aโ‚โ‚‚ = Aโ‚‚โ‚ = 1 โœ“
Aโ‚โ‚ƒ = Aโ‚ƒโ‚ = 2 โœ“
Aโ‚‚โ‚ƒ = Aโ‚ƒโ‚‚ = 5 โœ“
Why it matters: Physics (stress tensors), optimization, covariance matrices in statistics

🪞 Orthogonal Matrix — The Preserver

What is it? Preserves lengths and angles: QᵀQ = I

│ 0.8  -0.6  0 │
│ 0.6   0.8  0 │
│ 0     0    1 │

Superpower: Rotations + reflections, no distortion!
Distance between points = unchanged
Angles = unchanged
det(Q) = ±1
Why it matters: Computer vision, QR decomposition, numerical stability

⚪ Zero Matrix — The Destroyer

What is it? All entries are zero (the "0" of matrices).

│ 0  0  0 │
│ 0  0  0 │

Superpower: Collapses everything to zero!
A × 0 = 0
Why it matters: Represents "no transformation", used as initializer, theoretical importance
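
Here is a small NumPy sketch that builds a few of these special matrices and checks their defining properties (the 45° angle and the test matrix are just illustrative choices):

    import numpy as np

    I = np.eye(3)                      # identity: ones on the diagonal
    D = np.diag([2, 5, 3])             # diagonal: scales each axis independently
    theta = np.radians(45)
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])   # rotation by 45 degrees

    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])
    print(D @ np.array([1, 1, 1]))           # [2 5 3]: each axis scaled by its diagonal entry
    print(np.allclose(A @ np.eye(2), A))     # True: A x I = A
    print(np.allclose(R.T @ R, np.eye(2)))   # True: rotation matrices are orthogonal
    print(round(np.linalg.det(R), 6))        # 1.0: rotations preserve area and orientation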

🧪 Try It Yourself!

Use the presets below to explore each special matrix type.

🎯 Lesson 4: Matrices in Motion

Learning Objectives:
✓ Visualize how matrices transform space
✓ Understand rotation, scaling, shearing, and reflection
✓ Connect determinant to geometric area changes

The Big Idea: Matrices Transform Space

A 2×2 matrix takes every point (x, y) in the plane and moves it to a new location. Think of it like molding clay!

How It Works:

│ a  b │ × │ x │ = │ ax + by │
│ c  d │   │ y │   │ cx + dy │

The matrix "mixes" x and y coordinates to create the new position!

Common Transformations

🔄 Rotation (by θ degrees)

│ cos(θ)  -sin(θ) │
│ sin(θ)   cos(θ) │

What it does: Spins vectors around the origin
Determinant: Always 1 (no area change!)
Example: θ = 90° → rotates everything 90 degrees counterclockwise

๐Ÿ“ Scaling (by factors sx, sy)

         
โ”‚ sx   0 โ”‚
โ”‚  0  sy โ”‚
         
What it does: Stretches/squashes along axes
Determinant: sx ร— sy (area multiplier)
Example: [2, 0; 0, 3] โ†’ doubles x, triples y

↔️ Shear (slant by k)

│ 1  k │  ← Horizontal shear
│ 0  1 │

What it does: "Slants" space (like pushing a deck of cards sideways)
Determinant: Always 1 (area preserved!)
Example: k = 1 → the unit square is skewed into a parallelogram

🪞 Reflection (over x-axis)

│  1   0 │
│  0  -1 │  ← Flip the y-coordinate

What it does: Mirrors across an axis
Determinant: -1 (flips orientation!)
Note: Negative det = mirror/flip

🔍 Determinant Tells The Story:
• |det| = how much area gets scaled (2 = double area, 0.5 = half area)
• det > 0 = preserves orientation (no flip)
• det < 0 = flips orientation (mirror)
• det = 0 = collapses to a line (singular matrix!)
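
To make the determinant story concrete, here is a hedged NumPy sketch that applies a rotation, a shear, and a reflection to the corners of the unit square and prints each determinant (the names and the 90° angle are illustrative):

    import numpy as np

    square = np.array([[0, 1, 1, 0],    # x-coordinates of the unit square's corners
                       [0, 0, 1, 1]])   # y-coordinates

    theta = np.radians(90)
    rotate  = np.array([[np.cos(theta), -np.sin(theta)],
                        [np.sin(theta),  np.cos(theta)]])
    shear   = np.array([[1, 1], [0, 1]])     # horizontal shear with k = 1
    reflect = np.array([[1, 0], [0, -1]])    # reflection over the x-axis

    for name, M in [("rotate", rotate), ("shear", shear), ("reflect", reflect)]:
        print(name, "det =", round(np.linalg.det(M), 2))   # 1.0, 1.0, -1.0
        print(M @ square)                                   # transformed corners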

💻 Interactive Transformer

Try transforming the unit square! Enter matrix values or use presets.


🎯 Lesson 5: Solving Systems with Matrices

Learning Objectives:
✓ Convert equation systems to matrix form (Ax = b)
✓ Solve systems using matrix operations
✓ Understand when systems have unique, infinite, or no solutions

Why Matrices for Equations?

Remember solving systems by hand? Painful! Matrices make it systematic and scalable to 100s of variables.

Step 1: Convert to Matrix Form

System of Equations:
2x + 3y = 8
x - y = 1

Matrix Form (Ax = b):

│ 2  3 │ · │ x │ = │ 8 │
│ 1 -1 │   │ y │   │ 1 │
   A        x       b

A = coefficients, x = unknowns, b = right-hand sides

Step 2: Solve using Matrix Inverse

If A is invertible (det ≠ 0), the solution is: x = A⁻¹b

Example: Solve the system above

Step 1: Find det(A) = (2)(-1) - (3)(1) = -2 - 3 = -5 ✓ (non-zero!)

Step 2: Find A⁻¹ (use the formula for 2×2):

A⁻¹ = (1/det) × │  d  -b │ = (1/-5) × │ -1  -3 │ = │ 0.2   0.6 │
                │ -c   a │            │ -1   2 │   │ 0.2  -0.4 │

Step 3: Multiply x = A⁻¹b:

│ x │ = │ 0.2   0.6 │ · │ 8 │ = │ (0.2×8) + (0.6×1)  │ = │ 2.2 │
│ y │   │ 0.2  -0.4 │   │ 1 │   │ (0.2×8) + (-0.4×1) │   │ 1.2 │

Answer: x = 2.2, y = 1.2

✓ Check: 2(2.2) + 3(1.2) = 4.4 + 3.6 = 8.0 ✓ | 2.2 - 1.2 = 1.0 ✓
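
In code you would normally let a solver do this rather than building A⁻¹ by hand (it is faster and numerically safer). A minimal NumPy sketch for the same system:

    import numpy as np

    A = np.array([[2.0,  3.0],
                  [1.0, -1.0]])
    b = np.array([8.0, 1.0])

    x = np.linalg.solve(A, b)       # solves Ax = b directly
    print(x)                        # [2.2 1.2]
    print(np.allclose(A @ x, b))    # True: the solution satisfies both equations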

The Three Cases: How Many Solutions?

✅ Case 1: Unique Solution (det ≠ 0)

When: Matrix A is invertible (full rank)
Means: Exactly ONE solution exists
Solve: x = A⁻¹b

Example: │ 2  3 │   det = 2  →  Unique solution!
         │ 0  1 │

⚠️ Case 2: Infinite Solutions (det = 0, consistent)

When: Equations are dependent (one is a multiple of another)
Means: Infinite solutions (a line/plane of solutions)
Example: 2x + 4y = 6 and x + 2y = 3 (same line!)

Matrix: │ 2  4 │   det = 0  →  Infinite solutions
        │ 1  2 │

โŒ Case 3: No Solution (det = 0, inconsistent)

When: Equations are contradictory
Means: No solution exists (parallel lines never meet!)
Example: 2x + 4y = 6 and 2x + 4y = 10 (impossible!)

Same matrix but different b โ†’ No solution
๐Ÿ” Quick Check: Calculate det(A) first!
โ€ข det โ‰  0 โ†’ Unique solution guaranteed โœ“
โ€ข det = 0 โ†’ Check if consistent (infinite) or inconsistent (no solution)
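
One way to automate this quick check is to compare the rank of A with the rank of the augmented matrix [A | b]; the sketch below is one possible implementation of that idea (the helper name is made up for illustration):

    import numpy as np

    def classify_system(A, b):
        """Classify Ax = b as 'unique', 'infinite', or 'no solution' using matrix ranks."""
        augmented = np.column_stack([A, b])            # the matrix [A | b]
        rank_A = np.linalg.matrix_rank(A)
        rank_aug = np.linalg.matrix_rank(augmented)
        if rank_A == rank_aug == A.shape[1]:
            return "unique"
        return "infinite" if rank_A == rank_aug else "no solution"

    A = np.array([[2.0, 4.0],
                  [1.0, 2.0]])                          # det = 0 (dependent rows)
    print(classify_system(A, np.array([6.0, 3.0])))     # infinite (consistent)
    print(classify_system(A, np.array([6.0, 5.0])))     # no solution (inconsistent)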

💻 Interactive System Solver

Solve your own linear systems! Enter the number of equations and let the matrix solver work its magic.

🧪 Practice Problems

Exercise 1: Convert to Matrix Form

Write this system as Ax = b:

3x + 2y = 7
x - 4y = -5
Solution:
A = │ 3  2 │   x = │ x │   b = │  7 │
    │ 1 -4 │       │ y │       │ -5 │
Exercise 2: Solve by Hand

Solve using matrix inverse:

│ 1  2 │ · │ x │ = │  5 │
│ 3  4 │   │ y │   │ 11 │

Solution:
det(A) = (1×4) - (2×3) = 4 - 6 = -2 ✓

A⁻¹ = (1/-2) × │  4  -2 │ = │  -2    1   │
               │ -3   1 │   │ 1.5  -0.5  │

x = A⁻¹b = │  -2    1   │ × │  5 │ = │ (-2×5) + (1×11)     │ = │ 1 │
           │ 1.5  -0.5  │   │ 11 │   │ (1.5×5) + (-0.5×11) │   │ 2 │

Answer: x = 1, y = 2
Exercise 3: Determine Solution Type

How many solutions does each system have?

System A: │ 2  6 │     System B: │ 1  2 │
          │ 1  3 │               │ 3  5 │

Solution:
System A: det = (2×3) - (6×1) = 6 - 6 = 0
→ Either infinite solutions or no solution (need to check consistency)

System B: det = (1×5) - (2×3) = 5 - 6 = -1 ≠ 0
→ Unique solution exists! ✓

🎯 Lesson 6: Eigenvalues — The Magic Directions

Learning Objectives:
✓ Understand what eigenvalues and eigenvectors represent geometrically
✓ Learn how to find eigenvalues (det(A - λI) = 0)
✓ Recognize why eigenvalues matter in applications

The Mind-Blowing Idea

Most matrices rotate AND stretch vectors. But there are special directions where the matrix ONLY stretches!

The Eigenvalue Equation

Av = λv

A = the matrix
v = the eigenvector (a special direction)
λ = the eigenvalue (the stretch factor)

"Matrix A just scales vector v by λ — no rotation!"

Example: See It In Action

Matrix A = │ 3  1 │    Candidate v = │ 1 │    Eigenvalue λ = 4
           │ 0  4 │                  │ 0 │

Let's verify whether Av = λv:

Av = │ 3  1 │ × │ 1 │ = │ 3×1 + 1×0 │ = │ 3 │   ← This should equal 4v
     │ 0  4 │   │ 0 │   │ 0×1 + 4×0 │   │ 0 │

λv = 4 × │ 1 │ = │ 4 │   ← They DON'T match! So v = [1, 0] is NOT an eigenvector for λ = 4.
         │ 0 │   │ 0 │     (It actually satisfies Av = 3v, so it is an eigenvector for λ = 3.)

Try v = │ 1 │ :
        │ 1 │

Av = │ 3  1 │ × │ 1 │ = │ 4 │   and   4v = │ 4 │   ✓ Match! This IS an eigenvector for λ = 4!
     │ 0  4 │   │ 1 │   │ 4 │              │ 4 │

How to Find Eigenvalues (The Algorithm)

Step 1: Set up the characteristic equation

det(A - λI) = 0

Why? We need Av = λv, which rearranges to (A - λI)v = 0.
This only has non-trivial solutions when det(A - λI) = 0.

Step 2: Calculate the determinant (gives a polynomial)

Example: A = │ 4  2 │
             │ 1  3 │

A - λI = │ 4-λ    2  │
         │  1    3-λ │

det(A - λI) = (4-λ)(3-λ) - (2)(1)
            = 12 - 4λ - 3λ + λ² - 2
            = λ² - 7λ + 10

Step 3: Solve for λ (find roots)

λ² - 7λ + 10 = 0
(λ - 5)(λ - 2) = 0

Eigenvalues: λ₁ = 5, λ₂ = 2

Step 4: Find eigenvectors (plug λ back in)

For λ = 5:  (A - 5I)v = 0  →  │ -1   2 │ v = 0  →  Eigenvector v₁ = │ 2 │
                              │  1  -2 │                            │ 1 │

Why Eigenvalues Matter

🎬 Google PageRank
The web is a giant matrix! Google finds the "importance" eigenvector of this matrix.
🔬 Physics & Engineering
Vibration modes, quantum mechanics, stability analysis — all use eigenvalues!
📊 Principal Component Analysis (PCA)
Data science's favorite tool! Uses eigenvectors to find important patterns in data.
🎮 Computer Graphics
Eigenvectors define "principal axes" for 3D object rotations.

🔍 Quick Facts:
• An n×n matrix has n eigenvalues (counting multiplicities)
• Symmetric matrices → always real eigenvalues
• Trace(A) = sum of eigenvalues
• det(A) = product of eigenvalues
• Eigenvalues of diagonal matrices = the diagonal entries!
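
Here is a short NumPy sketch that recomputes the eigenvalues of the worked example above and checks two of these quick facts (NumPy returns eigenvectors scaled to unit length, so [2, 1] appears as a rescaled version of itself):

    import numpy as np

    A = np.array([[4.0, 2.0],
                  [1.0, 3.0]])

    eigenvalues, eigenvectors = np.linalg.eig(A)
    print(eigenvalues)                                  # [5. 2.]  (order may vary)

    v1 = eigenvectors[:, 0]                             # eigenvector paired with eigenvalues[0]
    print(np.allclose(A @ v1, eigenvalues[0] * v1))     # True: Av = λv

    print(np.isclose(np.trace(A), eigenvalues.sum()))        # True: trace = sum of eigenvalues
    print(np.isclose(np.linalg.det(A), eigenvalues.prod()))  # True: det = product of eigenvalues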

💻 Compute & Visualize

Generate a random square matrix to explore its eigenvalues and eigenvectors.

📊 Visualization

💡 How to read this plot:
• Arrows (Real Eigenvalues): Show the eigenvectors (v) and how they get scaled (λv).
• Points (Complex Eigenvalues): Show the eigenvalues on the complex plane (rotation + scaling).

🎯 Lesson 7: SVD & PCA — Data Science Superpowers

Learning Objectives:
✓ Understand Singular Value Decomposition (A = UΣVᵀ)
✓ Learn how SVD enables data compression
✓ Master Principal Component Analysis (PCA) for dimensionality reduction

SVD: The Ultimate Matrix Factorization

Every matrix can be broken down into three simple pieces!

A = U Σ Vᵀ

A  = the original matrix
U  = rotate (left)
Σ  = stretch (diagonal)
Vᵀ = rotate (right)

Any transformation = Rotate → Stretch → Rotate!

The Three Pieces Explained

U (Left Singular Vectors)
• Orthogonal matrix (columns are perpendicular)
• Defines output space directions
• Size: m × m

Σ (Singular Values)
• Diagonal matrix with σ₁ ≥ σ₂ ≥ ... ≥ 0
• Shows "importance" of each direction
• Large σ = important, small σ = can be dropped!

Vᵀ (Right Singular Vectors)
• Orthogonal matrix (rows are perpendicular)
• Defines input space directions
• Size: n × n

🗜️ The Magic: Data Compression!

Keep only the TOP k singular values → throw away the rest → get a close approximation with less data!

Example: Compress an Image
Original 1000×1000 image matrix = 1,000,000 numbers

After SVD: Keep the top 50 singular values
→ Storage needed: 50 × (1000 + 1000 + 1) ≈ 100,000 numbers
→ Compression ratio: 10x smaller! Only 10% of the original size

Result: The image looks almost identical but the file is 90% smaller! 🎉

This is essentially the idea behind JPEG compression (which uses the DCT, a cousin of the SVD).
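
A minimal NumPy sketch of the same low-rank idea, using a small random matrix as a stand-in for an image (the shape and the choice of k are illustrative):

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((100, 80))            # stand-in for an image matrix

    U, s, Vt = np.linalg.svd(A, full_matrices=False)

    k = 10                                        # keep only the top k singular values
    A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]   # best rank-k approximation of A

    kept = k * (A.shape[0] + A.shape[1] + 1)      # numbers stored after truncation
    print(kept, "numbers instead of", A.size)     # 1810 instead of 8000
    print("relative error:", np.linalg.norm(A - A_k) / np.linalg.norm(A))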

📊 PCA: Finding Patterns in Data

Principal Component Analysis uses SVD to find the "most important directions" in your data.

The PCA Recipe (4 Steps)
  1. Center the data: Subtract the mean from each feature
  2. Compute the covariance matrix: Shows how features vary together
  3. Find eigenvalues & eigenvectors: Get the principal components (eigenvectors of the covariance matrix)
  4. Project onto the top components: Reduce from 100D → 2D!

Real Example: Netflix Recommendations
• You have 10,000 movies (10,000 dimensions!)
• PCA finds ~50 "hidden factors" (action, romance, comedy, etc.)
• Each user's taste = a combination of these 50 factors
• 10,000D → 50D = 200x compression! 🚀

💡 Key Insight: Not all directions matter equally! SVD/PCA find the directions with the MOST variation. Drop the boring directions, keep the interesting ones. (The sketch below walks through the four-step recipe in code.)
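
Here is a hedged NumPy sketch of the four-step recipe on a tiny synthetic dataset (the data, and the choice of keeping 2 components, are purely illustrative):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 5))              # 200 samples, 5 features (toy data)

    # 1. Center the data
    Xc = X - X.mean(axis=0)

    # 2. Covariance matrix (features x features)
    C = np.cov(Xc, rowvar=False)

    # 3. Eigenvalues & eigenvectors of the covariance matrix
    eigenvalues, eigenvectors = np.linalg.eigh(C)   # eigh: symmetric input, real output
    order = np.argsort(eigenvalues)[::-1]           # largest eigenvalue first
    components = eigenvectors[:, order[:2]]         # top 2 principal components

    # 4. Project onto the top components: 5D -> 2D
    X_reduced = Xc @ components
    print(X_reduced.shape)                          # (200, 2)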

💻 Interactive SVD Decomposition

Generate a random matrix or enter one to see its Singular Value Decomposition.

🎯 Interactive PCA Analysis

Generate a random dataset (rows=samples, cols=features) and find its principal components.