Senior Data Platform Engineer


Job Description

Job Title: Senior Data Platform Engineer
Location: Pune, India
Work Mode: Work From Office (WFO), 5 Days a Week
Shift Timing: 12:00 PM - 9:00 PM IST


About Zywave

Zywave is a leading provider of InsurTech solutions, empowering insurance brokers and agencies with innovative software tools to grow and manage their business. We are building a modern data platform to deliver scalable, secure, and high-performance solutions that drive actionable insights across the organization.


Job Summary

We are looking for a highly skilled and experienced Senior Data Platform Engineer to lead the design, development, and optimization of our enterprise data platform. The ideal candidate will have deep expertise in Snowflake, ELT pipelines, DBT, and Azure Data Factory, and will play a key role in enabling data-driven decision-making across Zywave.


Key Responsibilities

  • Design and implement scalable ELT pipelines using DBT and Azure Data Factory to ingest, transform, and load data into Snowflake.
  • Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and deliver robust data models.
  • Optimize Snowflake performance through clustering, partitioning, and query tuning.
  • Develop and maintain reusable DBT models and documentation for data consistency and transparency (a minimal example is sketched below).
  • Ensure data quality, governance, and security across the platform.
  • Monitor and troubleshoot pipeline issues; implement proactive and scalable solutions.
  • Lead code reviews, mentor junior engineers, and drive best practices in data engineering.
  • Stay current with emerging technologies and recommend enhancements to the data platform architecture.
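
The reusable DBT models mentioned above are essentially SQL select statements plus a small amount of Jinja configuration. As a rough sketch only, assuming a hypothetical policy staging model (the model, table, and column names below are illustrative, not Zywave's actual schema), a minimal incremental DBT model built for Snowflake could look like this:

    -- models/marts/fct_policies.sql  (hypothetical model name)
    {{ config(
        materialized = 'incremental',
        unique_key   = 'policy_id'
    ) }}

    select
        policy_id,
        broker_id,
        written_premium,
        effective_date,
        updated_at
    from {{ ref('stg_policies') }}  -- hypothetical upstream staging model

    {% if is_incremental() %}
      -- on incremental runs, only pick up rows changed since the last load
      where updated_at > (select max(updated_at) from {{ this }})
    {% endif %}

A model like this would typically be built with dbt run --select fct_policies; materializing large tables incrementally, together with appropriate clustering keys on the Snowflake side, is one common way of approaching the performance-tuning responsibility above.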

Qualifications

  • Bachelor's or Master's degree in Computer Science, Engineering, or related field.
  • 8+ years of experience in data engineering or data platform development.
  • Strong hands-on expertise in Snowflake, DBT, Azure Data Factory, and ELT pipeline design.
  • Proficiency in SQL and Python for data manipulation and automation.
  • Experience with CI/CD tools and version control systems (e.g., Git).
  • Familiarity with data governance, security, and compliance standards.
  • Strong problem-solving skills and ability to work independently as well as in teams.
  • Solid understanding of data warehousing concepts and dimensional modeling.
  • Exposure to Tableau, Power BI, or similar visualization tools is a plus.
  • Snowflake certification is an advantage.

Skills

Mandatory: Git, Snowflake, DBT, Python, SQL
Good to Have: Azure Data Factory, ETL, AWS Glue, Tableau, Power BI, Prompt Engineering
Domain Knowledge: Insurance domain exposure preferred









Job Detail

  • Job Id: JD4095176
  • Total Positions: 1
  • Job Type: Full Time
  • Employment Status: Permanent
  • Job Location: MH, IN, India
  • Experience: 8+ years