We are looking for a Kafka Developer to design and implement event-driven architectures for real-time banking integrations. The role involves building high-throughput, low-latency streaming solutions, ensuring data quality and governance, and collaborating with multiple business units to deliver scalable, secure, and reliable streaming integrations.
Key Responsibilities:
Develop and maintain event-driven architectures using Confluent Kafka for real-time integration across banking systems (Core Banking, CRM, Fraud Detection, Risk, Compliance, etc.).
Design and implement Kafka producers and consumers to handle high-throughput, low-latency banking transactions (see the illustrative producer sketch after this list).
Build reusable streaming components using Kafka Streams and ksqlDB for fraud detection, notifications, operational alerts, and analytics (see the Kafka Streams sketch after this list).
Collaborate with the Data Governance team to ensure data lineage, quality, and metadata standards are met.
Implement schema evolution best practices using Confluent Schema Registry for compatibility across multiple banking applications.
Work closely with platform, DevOps, cybersecurity, and analytics teams for seamless delivery, monitoring, and operational support.
Contribute to integration roadmaps, perform impact assessments, and ensure alignment with the overall data platform architecture.
Develop real-time use cases such as customer onboarding status tracking, transaction streaming, digital engagement analytics, and branch performance monitoring.
Gather requirements from the Retail, Islamic Finance, Risk, and Compliance business units and translate them into scalable Kafka-based solutions.
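As an illustration of the producer work described above, the sketch below shows a minimal Java producer tuned for durable, low-latency publishing of transaction events. The broker address, topic name, key, payload, and tuning values are assumptions for illustration only, not details of this role's actual systems.

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;
    import java.util.Properties;

    public class TransactionProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
            props.put(ProducerConfig.ACKS_CONFIG, "all");              // durability: wait for all in-sync replicas
            props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, true); // avoid duplicates on retries
            props.put(ProducerConfig.LINGER_MS_CONFIG, 5);             // small batching window to balance latency and throughput

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // "transactions" is a hypothetical topic; keying by account id keeps one account's events ordered on one partition.
                producer.send(new ProducerRecord<>("transactions", "ACC-1001", "{\"amount\": 250.00}"),
                        (metadata, exception) -> {
                            if (exception != null) {
                                exception.printStackTrace(); // production code would log and alert instead
                            }
                        });
            }
        }
    }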
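For the Kafka Streams responsibility, here is a minimal sketch of a reusable streaming component that reads a hypothetical transactions topic and routes unusually large transactions to an alerts topic. The topic names, amount threshold, and string/JSON handling are illustrative assumptions; a production topology would use proper serdes (for example Avro with Schema Registry).

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;
    import java.util.Properties;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class FraudAlertStream {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "fraud-alert-stream"); // also used as the consumer group id
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");  // assumed broker address
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            // "transactions" and "fraud-alerts" are hypothetical topic names.
            KStream<String, String> transactions = builder.stream("transactions");
            transactions
                    .filter((accountId, json) -> extractAmount(json) > 100_000) // flag unusually large transactions
                    .to("fraud-alerts");

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }

        // Placeholder parser: pulls the number after "amount"; a real topology would deserialize with a JSON/Avro serde.
        private static double extractAmount(String json) {
            Matcher m = Pattern.compile("\"amount\"\\s*:\\s*([0-9.]+)").matcher(json);
            return m.find() ? Double.parseDouble(m.group(1)) : 0.0;
        }
    }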
Required Technical Skills:
Java Proficiency: Strong expertise in Core Java (minimum 5 years).
Kafka Connect APIs: Ability to implement custom Source and Sink connectors using the Task interfaces in Java (see the connector sketch after this list).
Schema Handling: Experience with SchemaBuilder, Struct, schema evolution, and Schema Registry (see the schema sketch after this list).
Error Handling & Retries: Ability to build robust connectors that gracefully handle failures and support fault tolerance.
Integration Expertise: Hands-on experience connecting to databases, REST APIs, SOAP services, and external systems.
Kafka Fundamentals: Deep understanding of topics, partitions, offsets, consumer groups, message retention, and scaling strategies.
SQL Proficiency: Ability to write complex SQL queries for data ingestion, transformation, and validation.
Connector Logs: Strong debugging skills using Confluent Connect logs and monitoring tools.
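To make the Kafka Connect requirement concrete, the following is a hedged skeleton of a custom SinkTask: it shows the lifecycle methods and how records delivered as Connect Structs are typically unpacked. The class name, field names, and target-system write are hypothetical.

    import org.apache.kafka.connect.data.Struct;
    import org.apache.kafka.connect.sink.SinkRecord;
    import org.apache.kafka.connect.sink.SinkTask;
    import java.util.Collection;
    import java.util.Map;

    // Hypothetical sink task that writes banking events to an external system.
    public class ExampleBankingSinkTask extends SinkTask {

        @Override
        public String version() {
            return "0.1.0";
        }

        @Override
        public void start(Map<String, String> props) {
            // Open connections to the target system here (JDBC, REST client, etc.).
        }

        @Override
        public void put(Collection<SinkRecord> records) {
            for (SinkRecord record : records) {
                // With Schema Registry and Avro/JSON converters, values typically arrive as Connect Structs.
                if (record.value() instanceof Struct) {
                    Struct value = (Struct) record.value();
                    String account = value.getString("account_id"); // field names are assumptions
                    double amount = value.getFloat64("amount");
                    writeToTargetSystem(account, amount);
                }
                // Throwing RetriableException here asks the framework to retry the batch;
                // other exceptions fail the task (or route records to a dead-letter queue if configured).
            }
        }

        @Override
        public void stop() {
            // Close connections and release resources.
        }

        private void writeToTargetSystem(String account, double amount) {
            // Placeholder for the real external-system write.
        }
    }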
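For the schema-handling requirement, this short fragment sketches how a Connect Schema and a matching Struct are built with SchemaBuilder, the pattern a custom Source connector would use when emitting records; the namespace and field names are assumptions. Marking later additions as optional fields (or giving them defaults) is what keeps new schema versions backward compatible in Schema Registry.

    import org.apache.kafka.connect.data.Schema;
    import org.apache.kafka.connect.data.SchemaBuilder;
    import org.apache.kafka.connect.data.Struct;

    public class TransactionSchemaExample {
        // A versioned value schema; adding optional fields later keeps it backward compatible.
        static final Schema TRANSACTION_SCHEMA = SchemaBuilder.struct()
                .name("com.example.banking.Transaction") // hypothetical namespace
                .version(1)
                .field("account_id", Schema.STRING_SCHEMA)
                .field("amount", Schema.FLOAT64_SCHEMA)
                .field("channel", SchemaBuilder.string().optional().build()) // optional, so it can be added in a later version
                .build();

        public static Struct sampleRecordValue() {
            return new Struct(TRANSACTION_SCHEMA)
                    .put("account_id", "ACC-1001")
                    .put("amount", 250.00)
                    .put("channel", "mobile");
        }
    }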
Preferred Qualifications:
Experience in the banking or financial services domain with large-scale data integration projects.
Familiarity with microservices architecture and containerization (Docker, Kubernetes).
Knowledge of real-time data analytics platforms and cloud-based Kafka deployments (AWS, Microsoft Azure, GCP).
Exposure to CI/CD pipelines, monitoring, and alerting frameworks (Prometheus, Grafana, ELK, etc.).
Soft Skills:
Strong problem-solving and analytical skills.
Excellent communication and stakeholder management abilities.
Ability to work in fast-paced, agile environments with cross-functional teams.
Education:
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
Job Types: Full-time, Permanent
Pay: ₹210,853.55 - ₹2,517,502.05 per year
Application Question(s):
What's your Notice Period?
Experience:
Kafka Developer: 6 years (Required)
Java Kafka Developer: 6 years (Required)
Location:
Pune, Maharashtra (Required)
Work Location: In person