Sr Data Engineer

GJ, India

Job Description

Key Responsibilities:

  • Architect, design, and optimize enterprise-grade NiFi data flows for large-scale ingestion, transformation, and routing.
  • Manage Kafka clusters at scale (multi-node, multi-datacenter setups), ensuring high availability, fault tolerance, and maximum throughput.
  • Create custom NiFi processors and develop advanced flow templates and best practices.
  • Handle advanced Kafka configurations: partitioning, replication, producer tuning, consumer optimization, rebalancing, etc.
  • Implement stream processing using Kafka Streams and manage Kafka Connect integrations with external systems (databases, APIs, cloud storage).
  • Design secure pipelines with end-to-end encryption, authentication (SSL/SASL), and RBAC for both NiFi and Kafka.
  • Proactively monitor and troubleshoot performance bottlenecks in real-time streaming environments.
  • Collaborate with infrastructure teams on scaling, backup, and disaster recovery planning for NiFi/Kafka.
  • Mentor junior engineers and enforce best practices for data flow and streaming architectures.

Required Skills and Qualifications:

  • 5+ years of hands-on production experience with Apache NiFi and Apache Kafka.
  • Deep understanding of NiFi architecture (FlowFile repository, provenance, state management, backpressure handling).
  • Mastery of Kafka internals: brokers, producers/consumers, ZooKeeper (or KRaft mode), offsets, ISR, topic configurations.
  • Strong experience with Kafka Connect, Kafka Streams, Schema Registry, and data serialization formats (Avro, Protobuf, JSON).
  • Expertise in tuning NiFi and Kafka for ultra-low latency and high throughput.
  • Strong scripting and automation skills (Shell, Python, Groovy, etc.).
  • Experience with monitoring tools: Prometheus, Grafana, Confluent Control Center, NiFi Registry, NiFi monitoring dashboards.
  • Solid knowledge of security best practices in data streaming (encryption, access control, secret management).
  • Hands-on experience deploying on cloud platforms (AWS MSK, Azure Event Hubs, GCP Pub/Sub with Kafka connectors).
  • Bachelor's or Master's degree in Computer Science, Data Engineering, or an equivalent field.

Preferred (Bonus) Skills:

  • Experience with containerization and orchestration (Docker, Kubernetes, Helm).
  • Knowledge of stream processing frameworks such as Apache Flink or Spark Streaming.
  • Contributions to open-source NiFi/Kafka projects (a huge plus).

Soft Skills:

  • Analytical thinker with exceptional troubleshooting skills.
  • Ability to architect solutions under tight deadlines.
  • Leadership qualities for guiding and mentoring engineering teams.
  • Excellent communication and documentation skills.

To apply, please send your resume to hr@rrmgt.in or call 9081819473.

Job Type: Full-time

Pay: From ₹1,500,000.00 per year

Work Location: In person


