Overview
Advanced Linear Algebra for Deep Learning. This course deepens the linear algebra concepts essential for understanding, analyzing, and optimizing modern Deep Learning models. The focus is on geometric interpretation, the matrix formulation of deep neural networks, and the direct application of these concepts to backpropagation, convolutions, attention, and foundation models. The course connects rigorous mathematical theory with practical applications in deep architectures used in production.
Syllabus
Module 1: Advanced Vector Spaces
- Norms, inner products and metrics
- Orthogonality and projections
- Basis change and coordinate systems
- Geometric interpretation of vector spaces
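To ground Module 1's norm, inner product, and projection topics, here is a minimal sketch, assuming NumPy as the numerical backend (the course does not prescribe a specific library):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(4)
y = rng.standard_normal(4)

# Norms and inner products
l2_norm = np.linalg.norm(x)              # Euclidean norm ||x||_2
inner = x @ y                            # standard inner product <x, y>

# Orthogonal projection of x onto the column space of A:
# P = A (A^T A)^{-1} A^T, assuming A has full column rank.
A = rng.standard_normal((4, 2))
P = A @ np.linalg.inv(A.T @ A) @ A.T
x_proj = P @ x

# The residual x - Px is orthogonal to every column of A.
print(np.allclose(A.T @ (x - x_proj), 0.0))   # True
```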
Module 2: Matrix Theory and Linear Transformations
- Linear operators and transformations
- Rank, null space and column space
- Invertibility and conditioning
- Matrix representations in deep models
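A minimal sketch of Module 2's rank, null space, and conditioning ideas, again assuming NumPy (the rank-deficient matrix below is purely illustrative):

```python
import numpy as np

# Rank-deficient by construction: the third row equals row 1 + row 2.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [5.0, 7.0, 9.0]])

rank = np.linalg.matrix_rank(A)          # 2
cond = np.linalg.cond(A)                 # huge: A is numerically singular

# The null space is spanned by the right singular vectors whose
# singular values are (numerically) zero.
U, s, Vt = np.linalg.svd(A)
null_basis = Vt[rank:].T                 # columns span null(A)
print(rank, cond, np.allclose(A @ null_basis, 0.0))
```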
Module 3: Eigenvalues, Eigenvectors and Stability
- Spectral decomposition
- Diagonalization
- Eigenvalues and training stability
- Spectral radius and gradient behavior
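Module 3's link between spectral radius and gradient behavior can be illustrated with a linear recurrence; a sketch under the assumption of a NumPy backend:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 8)) / np.sqrt(8)

# Spectral radius: largest eigenvalue magnitude.
rho = np.abs(np.linalg.eigvals(W)).max()

# Backpropagating through t steps of a linear recurrence multiplies the
# gradient by W^T repeatedly: it vanishes when rho < 1 and explodes when rho > 1.
g = rng.standard_normal(8)
for scale, label in [(0.5 / rho, "rho = 0.5"), (1.5 / rho, "rho = 1.5")]:
    h = g.copy()
    for _ in range(50):
        h = (scale * W).T @ h
    print(label, np.linalg.norm(h))      # vanishing vs. exploding norm
```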
Module 4: Matrix Decompositions
- LU and QR decomposition
- Singular Value Decomposition (SVD)
- Low-rank approximations
- Applications in model compression
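A minimal sketch of Module 4's low-rank approximation via truncated SVD, with a hypothetical dense-layer weight matrix and NumPy as the assumed backend:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((256, 128))      # stand-in for a dense layer's weights

# Truncated SVD: keep only the k largest singular triplets.
k = 16
U, s, Vt = np.linalg.svd(W, full_matrices=False)
W_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k]

# Eckart-Young: W_k is the best rank-k approximation in Frobenius norm.
rel_error = np.linalg.norm(W - W_k) / np.linalg.norm(W)

# Compression: k*(m + n + 1) stored numbers instead of m*n.
m, n = W.shape
print(rel_error, k * (m + n + 1), m * n)
```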
Module 5: Tensors and Multidimensional Algebra
- Tensor notation and operations
- Tensor contraction
- Broadcasting and dimensionality rules
- High-dimensional linear algebra
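Tensor contraction and broadcasting from Module 5 in a minimal NumPy sketch (the einsum index names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Batched matrix multiplication as a tensor contraction over the shared index k.
A = rng.standard_normal((32, 10, 20))    # (batch, i, k)
B = rng.standard_normal((32, 20, 5))     # (batch, k, j)
C = np.einsum('bik,bkj->bij', A, B)      # (batch, i, j)

# Broadcasting: a (5,) bias expands implicitly over the leading axes.
bias = rng.standard_normal(5)
Y = C + bias                             # shape (32, 10, 5)

print(C.shape, np.allclose(C, A @ B), Y.shape)
```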
Module 6: Linear Algebra in Backpropagation
- Jacobians and matrix derivatives
- Vectorized gradient computation
- Efficient backpropagation formulations
- Memory and computational trade-offs
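A minimal sketch of Module 6's vectorized gradient computation for a single linear layer, assuming NumPy and a placeholder mean-squared loss:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((64, 100))       # batch of inputs
W = rng.standard_normal((100, 10))
b = rng.standard_normal(10)

# Forward: Y = XW + b with a placeholder loss L = 0.5 * mean(Y^2).
Y = X @ W + b
loss = 0.5 * np.mean(Y ** 2)

# Backward in matrix form (no per-element Jacobians are materialized).
dY = Y / Y.size                          # dL/dY
dW = X.T @ dY                            # dL/dW
db = dY.sum(axis=0)                      # dL/db
dX = dY @ W.T                            # dL/dX

# Finite-difference check of a single entry of dW.
eps = 1e-5
W_pert = W.copy()
W_pert[0, 0] += eps
loss_pert = 0.5 * np.mean((X @ W_pert + b) ** 2)
print(np.allclose((loss_pert - loss) / eps, dW[0, 0], atol=1e-4))
```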
Module 7: Linear Algebra in CNNs and Transformers
- Convolutions as linear operators
- Toeplitz matrices and convolution
- Attention as matrix multiplication
- Linear projections in Transformers
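Two of Module 7's themes, convolution as a Toeplitz-structured linear operator and attention as matrix multiplication, in one minimal NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# 1) A 1-D "valid" convolution written as a Toeplitz matrix times the signal.
x = rng.standard_normal(8)
k = np.array([1.0, -2.0, 0.5])
n, m = x.size, k.size
T = np.zeros((n - m + 1, n))
for i in range(n - m + 1):
    T[i, i:i + m] = k[::-1]              # each row: shifted, flipped kernel
print(np.allclose(T @ x, np.convolve(x, k, mode='valid')))   # True

# 2) Scaled dot-product attention as matrix products plus a row-wise softmax.
d = 16
Q = rng.standard_normal((5, d))
K = rng.standard_normal((7, d))
V = rng.standard_normal((7, d))
scores = Q @ K.T / np.sqrt(d)                            # (5, 7)
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)
out = weights @ V                                        # (5, d)
print(out.shape)
```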
Module 8: Numerical Linear Algebra for Deep Learning
- Floating point arithmetic
- Conditioning and numerical stability
- Efficient linear algebra on GPUs
- Practical optimization considerations
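A minimal sketch of Module 8's floating-point and conditioning concerns, assuming NumPy (the ill-conditioned system below is contrived for illustration):

```python
import numpy as np

# Floating point: in float32, adding 1.0 to 1e8 is lost to rounding.
big = np.float32(1e8)
print(big + np.float32(1.0) == big)      # True

# Conditioning: a nearly singular system amplifies tiny data perturbations.
eps = 1e-8
A = np.array([[1.0, 1.0],
              [1.0, 1.0 + eps]])
b = np.array([2.0, 2.0 + eps])           # exact solution x = [1, 1]
x = np.linalg.solve(A, b)

b_noisy = b + np.array([0.0, 1e-6])      # perturb the right-hand side slightly
x_noisy = np.linalg.solve(A, b_noisy)

print(np.linalg.cond(A))                 # ~4e8
print(np.linalg.norm(x_noisy - x))       # error amplified to ~1e2
```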