RabbitMQ with Event Streams Course



24 hours
Overview

The RabbitMQ with Event Streams course teaches how to design, implement, and operate high-performance event-driven pipelines using RabbitMQ Streams. Students progress from the fundamentals to advanced practice, including scalability, partitioning, monitoring, retention, offsets, replay, and integration with microservices and real-time data processing tools.

Objective

After completing this RabbitMQ with Event Streams course, you will be able to:

  • Understand the RabbitMQ Streams model in depth
  • Build complete event-driven pipelines
  • Configure, operate, and monitor stream queues
  • Build high-throughput producers and consumers
  • Work with offsets, retention, and event replay
  • Implement Super Streams for high scalability
  • Design event-driven integrations between microservices
  • Build robust data streaming architectures

Target Audience
  • Backend developers
  • Software engineers
  • Solution architects
  • DevOps and SRE engineers
  • Integration and messaging specialists
  • Professionals working with real-time data

Prerequisites
  • Basic knowledge of messaging and AMQP
  • Basic knowledge of containers (Docker)
  • Experience with at least one language (Python, Java, Node.js, or Go)
  • Basic knowledge of microservices

Materials
English/Portuguese

Course Outline

Module 1 – Event Streaming Fundamentals (3 hours)

  1. What are events and event streams
  2. Event-driven architecture principles
  3. Comparing Message Queues vs Event Streams
  4. RabbitMQ architecture refresher
  5. When to use RabbitMQ Streams
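To make the "Message Queues vs Event Streams" comparison in Module 1 concrete, here is a minimal broker-free Python sketch (the class names are illustrative models, not a RabbitMQ API): a classic queue delivers each message once and destructively, while a stream is an append-only log that any consumer can re-read from any offset.

```python
from collections import deque

class ClassicQueue:
    """Destructive read: a delivered message is removed from the queue."""
    def __init__(self):
        self._q = deque()
    def publish(self, msg):
        self._q.append(msg)
    def consume(self):
        return self._q.popleft() if self._q else None

class Stream:
    """Append-only log: nothing is removed; consumers read from an offset."""
    def __init__(self):
        self._log = []
    def publish(self, msg):
        self._log.append(msg)
    def read(self, offset):
        return self._log[offset:]

queue, stream = ClassicQueue(), Stream()
for m in ("a", "b", "c"):
    queue.publish(m)
    stream.publish(m)

queue.consume()          # "a" is delivered and removed
print(queue.consume())   # prints: b  ("a" is gone for good)
print(stream.read(0))    # prints: ['a', 'b', 'c']  (full replay still possible)
```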

Module 2 – RabbitMQ Streams Deep Dive (3 hours)

  1. How Streams work internally
  2. Segments, indexes, retention policies
  3. Sequential write architecture
  4. Advantages over traditional AMQP queues
  5. Hands-on: Exploring stream structure
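The segment and retention topics above can be illustrated with a tiny simulation. This is a conceptual model, not RabbitMQ's actual storage code: messages are appended to fixed-size segments, and retention discards whole old segments, never individual messages.

```python
SEGMENT_SIZE = 3   # messages per segment (tiny, for demonstration only)
MAX_SEGMENTS = 2   # retention policy: keep at most 2 segments

segments = [[]]
for i in range(8):
    # When the current segment is full, roll over to a new one
    if len(segments[-1]) == SEGMENT_SIZE:
        segments.append([])
    segments[-1].append(f"msg-{i}")
    # Retention is applied at segment granularity: drop the oldest segment(s)
    while len(segments) > MAX_SEGMENTS:
        segments.pop(0)

print(segments)  # the oldest whole segment was discarded
```

Because deletion happens per segment, a stream can briefly hold more data than its retention target; that is the trade-off that makes sequential writes and cheap truncation possible.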

Module 3 – Installing & Configuring RabbitMQ Streams (2 hours)

  1. Installing RabbitMQ with streams enabled
  2. Configuring plugins and tuning parameters
  3. Storage, durability, disk usage
  4. Hands-on: Setting up a full stream environment
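Module 3's hands-on can be bootstrapped with a couple of commands. The container name and image tag below are examples; the plugin names (`rabbitmq_stream`, `rabbitmq_stream_management`) and the stream protocol's default port 5552 are as shipped with RabbitMQ.

```shell
# Start RabbitMQ with the management UI (image tag is an example)
docker run -d --name rabbit \
  -p 5552:5552 -p 5672:5672 -p 15672:15672 \
  rabbitmq:3-management

# Enable the stream plugin and its management extension
docker exec rabbit rabbitmq-plugins enable rabbitmq_stream rabbitmq_stream_management
```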

Module 4 – Stream Producers (3 hours)

  1. Producer API architecture
  2. Batching, compression, flow control
  3. Exactly-once publishing and idempotence
  4. Error handling and retries
  5. Hands-on: Building a high-throughput producer
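The exactly-once publishing topic above rests on deduplication: the broker tracks the highest publishing sequence number seen per named producer and silently drops anything at or below it, so a retried publish is harmless. A broker-free sketch of that idea (all names are illustrative):

```python
class DedupLog:
    """Models broker-side deduplication by producer name + sequence number."""
    def __init__(self):
        self.log = []
        self._last_seq = {}   # producer name -> highest sequence accepted

    def publish(self, producer: str, seq: int, msg: str) -> bool:
        if seq <= self._last_seq.get(producer, -1):
            return False      # duplicate (e.g. a retry after a timeout): ignored
        self._last_seq[producer] = seq
        self.log.append(msg)
        return True

log = DedupLog()
log.publish("orders-producer", 0, "order-1")
log.publish("orders-producer", 1, "order-2")
log.publish("orders-producer", 1, "order-2")   # retried publish: deduplicated
print(log.log)   # prints: ['order-1', 'order-2']
```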

Module 5 – Stream Consumers (3 hours)

  1. Offset-based consumption
  2. Replay strategies
  3. Consumer groups and horizontal scaling
  4. Backpressure and flow management
  5. Hands-on: Implementing high-speed consumers
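Offset-based consumption and replay, the first two topics above, can be simulated without a broker: each consumer owns its position in the log, so replay is simply re-reading from an earlier offset. The `"first"`/`"last"` names below loosely mirror the offset specifications used when attaching a stream consumer; the function itself is illustrative.

```python
stream = [f"event-{i}" for i in range(5)]

def consume(stream, offset):
    """Yield (offset, event) pairs starting at `offset` ("first", "last", or an int)."""
    if offset == "first":
        start = 0
    elif offset == "last":
        start = max(len(stream) - 1, 0)
    else:
        start = offset
    for i in range(start, len(stream)):
        yield i, stream[i]

print(list(consume(stream, "last")))   # prints: [(4, 'event-4')]
print(list(consume(stream, 2)))        # replay from offset 2 onward
```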

Module 6 – Super Streams & Partitioning (3 hours)

  1. What are Super Streams
  2. Partitioning strategies and event keys
  3. Scaling consumers and producers
  4. Ensuring ordering guarantees
  5. Hands-on: Creating a partitioned Super Stream
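The partitioning and ordering topics above hinge on one rule: events sharing a key always hash to the same partition, so per-key order is preserved even as partitions scale out. A broker-free sketch of key-based routing (the partition count and routing function are illustrative, not the Super Streams implementation):

```python
import hashlib

PARTITIONS = 3
partitions = {p: [] for p in range(PARTITIONS)}

def route(key: str) -> int:
    # A stable hash: unlike Python's built-in hash(), md5 is consistent across runs
    digest = hashlib.md5(key.encode()).digest()
    return int.from_bytes(digest[:4], "big") % PARTITIONS

for order_id, event in [("A", "created"), ("B", "created"),
                        ("A", "paid"), ("A", "shipped")]:
    partitions[route(order_id)].append((order_id, event))

# All of order A's events sit in a single partition, in publish order
print(partitions[route("A")])
```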

Module 7 – Event Processing & Data Pipelines (2 hours)

  1. Filtering, aggregation, enrichment
  2. Using RabbitMQ with ETL and analytics tools
  3. Building real-time pipelines
  4. Hands-on: Event transformation exercises
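The filtering, aggregation, and enrichment stages listed above chain naturally. A minimal Python sketch of such a pipeline over an in-memory batch of events (field names are illustrative):

```python
events = [
    {"user": "u1", "action": "click", "amount": 0},
    {"user": "u2", "action": "purchase", "amount": 30},
    {"user": "u1", "action": "purchase", "amount": 20},
]

purchases = (e for e in events if e["action"] == "purchase")   # filter
enriched = ({**e, "currency": "USD"} for e in purchases)       # enrich

totals = {}                                                    # aggregate
for e in enriched:
    totals[e["user"]] = totals.get(e["user"], 0) + e["amount"]

print(totals)   # prints: {'u2': 30, 'u1': 20}
```

In a real deployment each stage would consume from one stream and publish to the next; the transformation logic is the same.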

Module 8 – Microservices Integration (2 hours)

  1. Event-driven microservices patterns
  2. Designing asynchronous workflows
  3. Using streams as event logs
  4. Saga and outbox patterns with RabbitMQ Streams
  5. Hands-on: Microservice integration lab
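The outbox pattern mentioned above solves the dual-write problem: the business row and the event row are written in one local database transaction, and a separate relay later publishes the outbox rows to the stream. A sketch using SQLite (table and column names are illustrative):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE orders (id TEXT PRIMARY KEY, status TEXT);
    CREATE TABLE outbox (id INTEGER PRIMARY KEY, payload TEXT,
                         published INTEGER DEFAULT 0);
""")

with db:  # both inserts commit (or roll back) atomically
    db.execute("INSERT INTO orders VALUES ('o-1', 'created')")
    db.execute("INSERT INTO outbox (payload) VALUES ('order o-1 created')")

# The relay process polls unpublished rows and pushes them to the stream
rows = db.execute("SELECT id, payload FROM outbox WHERE published = 0").fetchall()
for row_id, payload in rows:
    # publishing `payload` to RabbitMQ would happen here
    db.execute("UPDATE outbox SET published = 1 WHERE id = ?", (row_id,))
db.commit()

print(rows)   # prints: [(1, 'order o-1 created')]
```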

Module 9 – Monitoring & Observability (2 hours)

  1. Stream metrics and monitoring
  2. Prometheus + Grafana dashboards
  3. Detecting consumer lag
  4. Troubleshooting performance issues
  5. Hands-on: Observability lab
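Consumer lag, the key health signal covered above, is simply the distance between the newest offset in the stream and the consumer's last committed offset:

```python
def consumer_lag(last_stream_offset: int, committed_offset: int) -> int:
    """Lag in messages; never negative (a fully caught-up consumer has lag 0)."""
    return max(last_stream_offset - committed_offset, 0)

print(consumer_lag(last_stream_offset=10_000, committed_offset=9_400))  # prints: 600
```

A lag that grows monotonically means consumers cannot keep up with producers; that is the metric to alert on in the Prometheus/Grafana dashboards.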

Module 10 – Scaling & Advanced Operations (2 hours)

  1. Auto-scaling consumers
  2. Stream storage scaling and disk optimization
  3. Changing retention without downtime
  4. Backup/restore and disaster recovery
  5. Hands-on: Performance tuning scenarios

Module 11 – Real-World Case Studies (1 hour)

  1. IoT streaming
  2. Log ingestion
  3. E-commerce event pipelines
  4. Financial transaction streaming
  5. Clickstream analytics

Module 12 – Final Project (2 hours)

  1. Build a complete end-to-end event streaming solution, including:
     • Producer
     • Consumer
     • Partitioned Super Stream
     • Retention & offsets
     • Monitoring dashboards
     • Event processing flow
