Overview
The Kafka with Node.js course is designed for developers who want to build scalable, reactive Node.js applications integrated with Apache Kafka.
During the training, participants will learn to create producers, consumers, and stream processors, configure topics, connect to a Kafka cluster, and implement event-driven architectures in Node.js.
The course combines theory, practical examples, and hands-on labs to equip students to build reliable, scalable data pipelines with Node.js and Kafka.
Course Outline
Module 1: Introduction to Kafka with Node.js
- Kafka architecture and concepts for Node.js developers
- Setting up development environment (Node.js, Kafka, Docker)
- Overview of Node.js Kafka client libraries (kafkajs, node-rdkafka)
- Creating basic producers and consumers
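Before touching a real cluster, the produce/consume flow covered in this module can be sketched in memory. The class below is a teaching stand-in, not kafkajs itself: the topic name and message shapes are illustrative, and the method names merely echo the shape of kafkajs's `producer.send()` and `run({ eachMessage })`.

```javascript
// Minimal in-memory sketch of the produce/consume flow. No broker involved;
// the "topic" is just an append-only array whose index plays the offset role.
class InMemoryTopic {
  constructor(name) {
    this.name = name;       // topic name (hypothetical: 'orders')
    this.records = [];      // append-only log; array index = offset
    this.handlers = [];     // registered consumer callbacks
  }
  // Producer side: append records and notify consumers, like producer.send()
  send(messages) {
    for (const msg of messages) {
      const record = { offset: this.records.length, key: msg.key ?? null, value: msg.value };
      this.records.push(record);
      for (const handler of this.handlers) handler(record);
    }
  }
  // Consumer side: register a per-message handler, like run({ eachMessage })
  eachMessage(handler) { this.handlers.push(handler); }
}

const topic = new InMemoryTopic('orders');
const seen = [];
topic.eachMessage(({ offset, key, value }) => seen.push(`${offset}:${key}:${value}`));
topic.send([{ key: 'user-1', value: 'created' }, { key: 'user-2', value: 'updated' }]);
// seen is now ['0:user-1:created', '1:user-2:updated']
```

The lab then replaces this toy with a real kafkajs client, where sends are asynchronous and offsets come from the broker.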
Module 2: Producers and Consumers in Node.js
- Sending and receiving messages asynchronously
- Configuring retries, acknowledgments, and batching
- Serialization with JSON and Avro
- Error handling and logging
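The retry behavior discussed above can be reasoned about in isolation. kafkajs applies its own retry policy internally (via its `retry` configuration), so the functions below are only an illustrative sketch of the exponential-backoff idea, with made-up names and a synchronous fake send in place of a real async producer call.

```javascript
// Delay before retry attempt n (0-based): base * 2^n, capped at maxDelayMs.
function backoffDelay(attempt, baseMs = 100, maxDelayMs = 3000) {
  return Math.min(baseMs * 2 ** attempt, maxDelayMs);
}

// Retry a send function up to maxRetries times, recording the delays a real
// implementation would sleep between attempts.
function sendWithRetry(sendFn, maxRetries = 5) {
  const delays = [];
  for (let attempt = 0; ; attempt++) {
    try {
      return { result: sendFn(), delays };
    } catch (err) {
      if (attempt >= maxRetries) throw err;
      delays.push(backoffDelay(attempt)); // real code would await a sleep here
    }
  }
}

// Usage: a fake send that fails twice, then succeeds.
let calls = 0;
const outcome = sendWithRetry(() => {
  calls++;
  if (calls < 3) throw new Error('broker unavailable');
  return 'ack';
});
// outcome.result === 'ack'; the two failed attempts waited 100 ms, then 200 ms
```

Capping the delay keeps a long outage from producing unbounded waits between attempts.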
Module 3: Advanced Kafka Features
- Working with partitions and keys for message ordering
- Consumer groups and offset management
- Implementing idempotency and exactly-once semantics
- Transactional messaging in Node.js
Module 4: Stream Processing with Node.js
- Introduction to stream processing concepts
- Using Node.js for lightweight stream processing
- Stateful vs stateless processing
- Event transformations, filtering, and aggregation
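The stateless/stateful distinction above can be shown with plain functions, independent of any Kafka client. The event shape (`{ type, userId, amount }`) is invented for the example.

```javascript
// Stateless processing: each event is filtered/transformed on its own,
// with no memory of previous events.
function toAudit(events) {
  return events
    .filter((e) => e.type === 'purchase')
    .map((e) => ({ user: e.userId, cents: Math.round(e.amount * 100) }));
}

// Stateful processing: a running aggregate per key, the kind of state a
// stream processor must persist and restore across restarts.
function totalsByUser(events) {
  const totals = new Map();
  for (const e of events) {
    if (e.type !== 'purchase') continue;
    totals.set(e.userId, (totals.get(e.userId) ?? 0) + e.amount);
  }
  return totals;
}

const events = [
  { type: 'purchase', userId: 'a', amount: 5 },
  { type: 'view',     userId: 'a' },
  { type: 'purchase', userId: 'a', amount: 2.5 },
  { type: 'purchase', userId: 'b', amount: 1 },
];
const audit = toAudit(events);       // one entry per purchase event
const totals = totalsByUser(events); // running totals: a → 7.5, b → 1
```

In a real pipeline the same logic runs per consumed message rather than over an array, and the `totals` map is exactly the state that must survive a consumer restart.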
Module 5: Integration with Kafka Ecosystem
- Using Schema Registry for Avro serialization
- Integrating with Kafka Connect and external systems
- Monitoring and logging Node.js Kafka applications
- Handling schema evolution and compatibility
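The compatibility idea behind schema evolution can be sketched with a deliberately simplified check, inspired by the backward-compatibility rule Schema Registry applies to Avro: a new schema can read old data only if every field it adds carries a default. The schemas below are plain objects, not real Avro schemas.

```javascript
// Simplified backward-compatibility check: fields added in the new schema
// must have a default, or old records cannot be decoded with it.
function isBackwardCompatible(oldSchema, newSchema) {
  const oldNames = new Set(oldSchema.fields.map((f) => f.name));
  return newSchema.fields.every(
    (f) => oldNames.has(f.name) || 'default' in f
  );
}

const v1 = { fields: [{ name: 'id' }, { name: 'email' }] };
const v2ok = {
  fields: [{ name: 'id' }, { name: 'email' }, { name: 'plan', default: 'free' }],
};
const v2bad = {
  fields: [{ name: 'id' }, { name: 'email' }, { name: 'plan' }],
};
// isBackwardCompatible(v1, v2ok)  → true  (new field has a default)
// isBackwardCompatible(v1, v2bad) → false (new field has no default)
```

A real registry also handles removed fields, type promotions, and other compatibility modes (forward, full); this sketch covers only the one rule.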
Module 6: Docker and Deployment
- Containerizing Node.js applications with Docker
- Running Kafka and Zookeeper in Docker Compose
- Deploying producers and consumers in containers
- Scaling applications and monitoring containerized services
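The lab in this module typically starts from a Compose file like the sketch below. It is one common local-development layout; the image tags, ports, and single-broker replication settings are illustrative, not a production configuration.

```yaml
# Hypothetical local-dev setup; image tags and ports are illustrative.
version: "3.8"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.5.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
  kafka:
    image: confluentinc/cp-kafka:7.5.0
    depends_on: [zookeeper]
    ports:
      - "9092:9092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
```

With this running, a containerized Node.js producer or consumer reaches the broker at `localhost:9092` from the host, or at `kafka:9092` from inside the Compose network (with the advertised listener adjusted accordingly).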
Module 7: Troubleshooting and Performance
- Diagnosing connectivity and configuration issues
- Monitoring consumer lag and throughput
- Optimizing performance for producers and consumers
- Logging, metrics collection, and alerting
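Consumer lag, the central metric in this module, is simple arithmetic per partition: the latest offset the broker has written minus the offset the consumer group has committed. The sketch below uses made-up offsets; in practice both numbers come from the broker (e.g. via an admin client or tools like `kafka-consumer-groups`).

```javascript
// lag = log-end offset (latest produced) - committed offset (last consumed),
// computed per partition and usually summed or maxed for alerting.
function consumerLag(partitions) {
  return partitions.map(({ partition, logEndOffset, committedOffset }) => ({
    partition,
    lag: logEndOffset - committedOffset,
  }));
}

const lags = consumerLag([
  { partition: 0, logEndOffset: 1500, committedOffset: 1480 },
  { partition: 1, logEndOffset: 900,  committedOffset: 900 },
]);
const totalLag = lags.reduce((sum, p) => sum + p.lag, 0);
// partition 0 lags by 20 messages, partition 1 is caught up; total lag is 20
```

A steadily growing lag means consumers cannot keep up with producers, which points at throughput tuning or scaling out the consumer group, both covered in this module.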
Module 8: Hands-On Project
Project: Build a complete Node.js Kafka pipeline with producers, consumers, stream processing, Avro serialization, and Dockerized deployment.