This role is ideal for someone who thrives in distributed systems and has hands-on experience with payment systems. You'll be part of a dynamic team building scalable data pipelines and orchestrating workflows across cloud platforms.
Key Responsibilities
Design and implement scalable data pipelines using Spark (Dataproc/Databricks)
Develop robust ETL processes in Python
Manage and optimize Delta Lake architectures for high-performance analytics
Build and maintain cross-cloud orchestration workflows (e.g., GCP, AWS, Azure)
Integrate and support payment system data flows and compliance requirements
Collaborate with data scientists, analysts, and product teams to deliver insights
Ensure data quality, security, and governance across platforms
Required Skills
7+ years of experience in data engineering or a related field
Strong proficiency in Spark (Dataproc or Databricks) and Python
Hands-on experience with Delta Lake and cloud data architectures
Proven ability to orchestrate workflows across multiple cloud environments
Familiarity with payment systems and related data compliance standards
Excellent problem-solving and communication skills
Nice to Have
Experience with Airflow, Prefect, or similar orchestration tools
Knowledge of CI/CD pipelines and DevOps practices
Exposure to real-time data streaming (Kafka, Pub/Sub)
Job Types: Full-time, Contractual / Temporary, Freelance
Contract length: 6 months
Pay: ₹80,000.00 - ₹100,000.00 per month
Work Location: Remote
Beware of fraud agents! Do not pay money to get a job.
MNCJobsIndia.com will not be responsible for any payment made to a third party. All Terms of Use apply.