Overview
This live, hands-on Deep Learning with Python training course builds on our Data Science with Python course and teaches participants the fundamentals of Deep Learning and how to implement artificial neural network (ANN) applications using Keras and TensorFlow.
Course Outline
Introduction to Artificial Neural Networks (ANNs) and Deep Learning
- Why artificial neural networks? Advantages of ANNs
- Understanding the essential concepts
- Activation functions, optimizers, back-propagation
- Components and architectures of artificial neural networks
- Evaluate the performance of neural networks on a known function
- Define and monitor convergence of a neural network
- Model selection
- Scoring new datasets with a model
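A minimal sketch of this module's workflow, assuming synthetic data for the known function (not official course code): a small Keras network with ReLU activations is trained with the Adam optimizer, convergence is monitored through the returned training history, and the fitted model scores new inputs.

```python
import numpy as np
import tensorflow as tf

# A known function (sin) lets us evaluate the network against ground truth.
x = np.linspace(-np.pi, np.pi, 1000).reshape(-1, 1).astype("float32")
y = np.sin(x)

# A small fully connected ANN: activation functions, layers, and a linear output.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(1,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),
])

# The optimizer and loss drive back-propagation inside fit().
model.compile(optimizer="adam", loss="mse")

# Monitor convergence via the training history and a validation split.
history = model.fit(x, y, epochs=50, validation_split=0.2, verbose=0)
print("final validation loss:", history.history["val_loss"][-1])

# Score a new dataset with the trained model.
x_new = np.array([[0.0], [0.5], [1.0]], dtype="float32")
print(model.predict(x_new, verbose=0))
```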
Constructing Deep Learning Models
- Preprocessing structured datasets for Deep Learning workflows
- Model validation strategies
- Architectural modifications to manage generalization error
- Regularization strategies
- Deep Learning: regression models
- Deep Learning: classification models
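An illustrative sketch of these topics, assuming a stand-in tabular dataset: features are scaled, the network uses dropout and an L2 weight penalty to manage generalization error, and validation is handled with a hold-out split plus early stopping.

```python
import numpy as np
import tensorflow as tf
from sklearn.preprocessing import StandardScaler

# Stand-in structured data; replace with your own features and labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8)).astype("float32")
y = (X[:, 0] + X[:, 1] > 0).astype("float32")

# Preprocessing: scale features before feeding them to the network.
X = StandardScaler().fit_transform(X)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(
        64, activation="relu",
        kernel_regularizer=tf.keras.regularizers.l2(1e-4)),  # weight penalty
    tf.keras.layers.Dropout(0.3),                            # regularization
    tf.keras.layers.Dense(1, activation="sigmoid"),          # classification head
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Validation strategy: hold-out split with early stopping on validation loss.
early = tf.keras.callbacks.EarlyStopping(patience=5, restore_best_weights=True)
model.fit(X, y, epochs=100, validation_split=0.2, callbacks=[early], verbose=0)
```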
Introduction to Image Processing with Python and Keras
- Management and preparation of image data for Deep Learning models
- The dimensionality of image data
- Handling image metadata
- Conversion of images to NumPy arrays
- Python Imaging Library (PIL/Pillow) and scikit-image (skimage)
- Keras' load_img() function
- Image standardization and resampling
- Augmentation strategies for image data
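A brief sketch of the image-preparation steps above, assuming a placeholder file name (example.jpg): the image is loaded and resized with Keras' load_img(), converted to a NumPy array, rescaled, and passed through simple augmentation layers.

```python
import numpy as np
import tensorflow as tf

# load_img() wraps PIL/Pillow and can resize the image on load.
img = tf.keras.utils.load_img("example.jpg", target_size=(224, 224))
arr = tf.keras.utils.img_to_array(img)   # NumPy array, shape (224, 224, 3)
arr = arr / 255.0                        # standardize pixel values to [0, 1]
batch = np.expand_dims(arr, axis=0)      # add the batch dimension
print(batch.shape)                       # (1, 224, 224, 3)

# Simple augmentation with Keras preprocessing layers.
augment = tf.keras.Sequential([
    tf.keras.layers.RandomFlip("horizontal"),
    tf.keras.layers.RandomRotation(0.1),
])
augmented = augment(batch, training=True)
```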
Deep Learning for Image Classification with Convolutional Architectures
- Image data is multidimensional
- Overview of convolutional architectures
- Convolution layers act as filters
- Pooling layers reduce computation
- Data augmentation through image transformation for smaller datasets
- Image transformation using the Pillow library
- Applying a model to a multi-class labeled dataset
- Evaluating a confusion matrix for multiple classes
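A compact sketch of a convolutional classifier, with MNIST standing in for the course dataset (an assumption): convolution layers act as learned filters, pooling reduces computation, and predictions are summarized in a multi-class confusion matrix.

```python
import numpy as np
import tensorflow as tf
from sklearn.metrics import confusion_matrix

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None].astype("float32") / 255.0   # add channel axis
x_test = x_test[..., None].astype("float32") / 255.0

model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),  # convolution = filter bank
    tf.keras.layers.MaxPooling2D(),                     # pooling reduces computation
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),    # one output per class
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=1, batch_size=128, verbose=0)

# Rows are true classes, columns are predicted classes.
preds = np.argmax(model.predict(x_test, verbose=0), axis=1)
print(confusion_matrix(y_test, preds))
```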
Time Series Forecasting with Deep Recurrent Architectures
- Identify limitations of feed-forward ANN architectures for sequential data
- Modify model architecture to include recurrent (RNN) components
- Preprocessing time series data for ingestion into RNN models
- Examine improvements to RNNs: The LSTM and GRU networks
- Time series forecasting with recurrent architectures
- Time series forecasting with 1D convolutional architectures
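A minimal sketch, assuming a synthetic sine series as stand-in data: the series is windowed into (samples, timesteps, features) shape and an LSTM forecasts the next step; swapping in a GRU or a 1D convolutional layer follows the same pattern.

```python
import numpy as np
import tensorflow as tf

series = np.sin(np.linspace(0, 100, 2000)).astype("float32")

# Preprocessing: slide a window over the series to build supervised pairs.
window = 20
X = np.stack([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]
X = X[..., None]   # RNNs expect (samples, timesteps, features)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(window, 1)),
    tf.keras.layers.LSTM(32),   # try GRU(32) or a Conv1D stack for comparison
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=64, verbose=0)

# One-step-ahead forecast from the most recent observed window.
print(model.predict(X[-1:], verbose=0))
```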
Deep Learning and Natural Language Processing (NLP)
- Text manipulation with TensorFlow
- Categorical representations and word embeddings
- Text embeddings as layers in an ANN
- Word2vec
- Exploiting pre-trained word embedding models
- Visualizing semantic relationships between words using t-SNE
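An assumed sketch of text handling in TensorFlow: raw strings are tokenized with a TextVectorization layer and mapped to dense vectors by an Embedding layer; a pre-trained matrix (e.g. word2vec or GloVe) could be supplied through the embedding's initializer instead of learning it from scratch.

```python
import tensorflow as tf

texts = ["deep learning for text", "word embeddings capture meaning"]

# Build a vocabulary and map strings to integer token ids.
vectorize = tf.keras.layers.TextVectorization(
    max_tokens=1000, output_sequence_length=8)
vectorize.adapt(texts)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(1,), dtype=tf.string),
    vectorize,                                                  # strings -> ids
    tf.keras.layers.Embedding(input_dim=1000, output_dim=16),  # ids -> vectors
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")
print(model(tf.constant([["deep learning"]])).shape)   # (1, 1)
```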
Transfer Learning
- Exploiting pre-trained models (VGG16) for image classification
- Selecting layers to unfreeze for specific applications
- Transfer learning and fine tuning
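A minimal sketch, assuming a 10-class image task: VGG16 is reused as a frozen feature extractor, then only its top convolutional block is unfrozen and the model is recompiled with a small learning rate for fine-tuning.

```python
import tensorflow as tf

base = tf.keras.applications.VGG16(
    weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False   # freeze all pre-trained layers

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax"),   # new task-specific head
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
# ... train the new head on the target dataset here ...

# Fine-tuning: unfreeze only the last VGG16 block and use a small learning rate.
for layer in base.layers:
    layer.trainable = layer.name.startswith("block5")
model.compile(optimizer=tf.keras.optimizers.Adam(1e-5),
              loss="sparse_categorical_crossentropy")
```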
Variational Autoencoders
- What is an autoencoder?
- Building a simple autoencoder from a fully connected layer
- Sparse autoencoders
- Deep convolutional autoencoders
- Applications of autoencoders to image denoising
- Sequential autoencoders
- Variational autoencoders
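A brief sketch of the simplest case listed above (an assumption, using flattened MNIST digits): a fully connected autoencoder compresses 784-dimensional inputs to a 32-dimensional code and is trained to reconstruct its own input; the sparse, convolutional, sequential, and variational variants build on this pattern.

```python
import tensorflow as tf

inputs = tf.keras.Input(shape=(784,))
encoded = tf.keras.layers.Dense(32, activation="relu")(inputs)        # encoder
decoded = tf.keras.layers.Dense(784, activation="sigmoid")(encoded)   # decoder

autoencoder = tf.keras.Model(inputs, decoded)
autoencoder.compile(optimizer="adam", loss="binary_crossentropy")

# Train the network to reconstruct its own input.
(x_train, _), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
autoencoder.fit(x_train, x_train, epochs=1, batch_size=256, verbose=0)
```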
Generative Adversarial Networks (GANs)
- Adversarial examples
- Generative and discriminative networks
- Building a simple generative adversarial network
- Generating images with a GAN
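A compact, assumed sketch of the two networks in a GAN and a single adversarial training step on flattened 28x28 images; a full training loop would call train_step() repeatedly over batches of real images.

```python
import tensorflow as tf

latent_dim = 64

generator = tf.keras.Sequential([           # noise -> fake image
    tf.keras.Input(shape=(latent_dim,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(784, activation="sigmoid"),
])
discriminator = tf.keras.Sequential([       # image -> real/fake score
    tf.keras.Input(shape=(784,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

bce = tf.keras.losses.BinaryCrossentropy()
g_opt = tf.keras.optimizers.Adam(1e-4)
d_opt = tf.keras.optimizers.Adam(1e-4)

def train_step(real_images):
    noise = tf.random.normal((tf.shape(real_images)[0], latent_dim))
    with tf.GradientTape() as g_tape, tf.GradientTape() as d_tape:
        fake = generator(noise, training=True)
        real_pred = discriminator(real_images, training=True)
        fake_pred = discriminator(fake, training=True)
        # Discriminator: tell real images from generated ones.
        d_loss = (bce(tf.ones_like(real_pred), real_pred)
                  + bce(tf.zeros_like(fake_pred), fake_pred))
        # Generator: produce images the discriminator labels as real.
        g_loss = bce(tf.ones_like(fake_pred), fake_pred)
    d_grads = d_tape.gradient(d_loss, discriminator.trainable_variables)
    g_grads = g_tape.gradient(g_loss, generator.trainable_variables)
    d_opt.apply_gradients(zip(d_grads, discriminator.trainable_variables))
    g_opt.apply_gradients(zip(g_grads, generator.trainable_variables))
    return d_loss, g_loss
```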
Transformer Architectures
- The problems with recurrent architectures for sequential data
- Attention-based architectures
- Positional encoding
- The Transformer: attention is all you need
- Time series classification using transformers
- GPT-3 and the future of natural language generation
- OpenAI Codex and the future of programmatic code generation
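A sketch, under assumptions, of a single Transformer-style encoder block for two-class time series classification: inputs are projected to the model dimension, a fixed sinusoidal positional encoding is added, and self-attention plus a feed-forward sub-layer with residual connections feed a pooled classification head.

```python
import numpy as np
import tensorflow as tf

seq_len, n_features, d_model = 50, 1, 32

inputs = tf.keras.Input(shape=(seq_len, n_features))
x = tf.keras.layers.Dense(d_model)(inputs)   # project to the model dimension

# Fixed sinusoidal positional encoding, added to every sequence in the batch.
pos = np.arange(seq_len)[:, None]
i = np.arange(d_model)[None, :]
angle = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
pos_enc = np.where(i % 2 == 0, np.sin(angle), np.cos(angle)).astype("float32")
x = x + pos_enc

# Self-attention and feed-forward sub-layers with residual connections.
attn = tf.keras.layers.MultiHeadAttention(num_heads=4, key_dim=d_model)(x, x)
x = tf.keras.layers.LayerNormalization()(x + attn)
ff = tf.keras.layers.Dense(d_model, activation="relu")(x)
x = tf.keras.layers.LayerNormalization()(x + ff)

x = tf.keras.layers.GlobalAveragePooling1D()(x)
outputs = tf.keras.layers.Dense(2, activation="softmax")(x)   # two classes

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```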