The Kafka Streaming Developer will be responsible for designing, developing, and maintaining robust and scalable real-time data streaming pipelines using Apache Kafka and integrating them with Snowflake for data warehousing and analytics. This role requires expertise in Kafka ecosystem components, streaming data ingestion, and optimizing data flow into Snowflake.
Responsibilities:
Design, build, and maintain real-time data streaming pipelines using Apache Kafka, Kafka Connect, Kafka Streams, and ksqlDB.
Develop and implement Kafka producers and consumers for various data sources and targets.
Integrate Kafka with Snowflake using Kafka Connectors (e.g., Snowflake Sink Connector), Snowpipe Streaming, or custom solutions for efficient data ingestion.
Configure and manage Kafka clusters, topics, partitions, and security settings (TLS/SSL encryption, SASL authentication, ACL- or RBAC-based authorization).
Optimize Kafka cluster performance, ensuring high availability, scalability, and fault tolerance.
Develop and implement data transformations and processing logic within Kafka Streams or other streaming frameworks.
Design and implement scalable Snowflake data warehouse solutions, including tables, views, and materialized views, optimized for analytical queries.
Monitor and troubleshoot Kafka and Snowflake data pipelines, identifying and resolving performance bottlenecks and data discrepancies.
Collaborate with data engineers, data scientists, and software developers to understand data requirements and deliver integrated solutions.
Ensure data quality, integrity, and security across the streaming and warehousing solutions.
Document technical designs, configurations, and operational procedures.
Participate in code reviews and promote best practices in Kafka and Snowflake development.
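For candidates unfamiliar with the Kafka-to-Snowflake integration mentioned above, a minimal Snowflake Sink Connector configuration gives a sense of the work involved. This is an illustrative sketch only: the account URL, user, role, database, schema, and topic names are placeholders, and exact property names and defaults should be verified against the Snowflake Kafka connector documentation for the version in use.

```json
{
  "name": "snowflake-sink",
  "config": {
    "connector.class": "com.snowflake.kafka.connector.SnowflakeSinkConnector",
    "topics": "orders,clickstream",
    "snowflake.url.name": "myaccount.snowflakecomputing.com:443",
    "snowflake.user.name": "KAFKA_CONNECT_USER",
    "snowflake.private.key": "<private-key>",
    "snowflake.role.name": "KAFKA_CONNECT_ROLE",
    "snowflake.database.name": "RAW",
    "snowflake.schema.name": "STREAMING",
    "snowflake.ingestion.method": "SNOWPIPE_STREAMING",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "com.snowflake.kafka.connector.records.SnowflakeJsonConverter",
    "buffer.count.records": "10000",
    "buffer.flush.time": "60"
  }
}
```

A configuration like this is typically submitted to the Kafka Connect REST API (e.g., a POST to the `/connectors` endpoint) or managed through a deployment tool; choosing between Snowpipe Streaming and file-based Snowpipe ingestion is one of the latency/cost trade-offs this role would own.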