Design, develop, and maintain scalable ETL/ELT pipelines in Snowflake to support data migration from legacy systems.
Leverage Python for data transformation, automation, and orchestration of migration workflows.
Optimize and refactor complex SQL queries to ensure efficient data processing and reporting in Snowflake.
Collaborate on data modeling and schema design to align with Snowflake architecture and performance best practices.
Monitor and troubleshoot data pipeline performance during and after migration phases.
Work closely with data analysts, scientists, and business stakeholders to ensure accurate and timely data delivery.
Implement and enforce data governance, security policies, and access controls within Snowflake.
Collaborate with DevOps teams to integrate data engineering workflows into broader CI/CD frameworks.
Required Skills:
4-6 years of experience in data engineering, with proven expertise in Snowflake and Python.
Strong command of Snowflake features such as scripting, Time Travel, virtual warehouses, and query optimization.
Hands-on experience with ETL tools, data integration strategies, and migration methodologies.
Solid understanding of data warehousing principles, normalization techniques, and performance optimization.
Familiarity with cloud platforms (AWS, Azure, or GCP) and orchestration tools.
Excellent problem-solving skills and ability to work independently in a dynamic, fast-paced environment.
Understanding of version control systems, particularly Git, including branching, merging, and pull request workflows.
Experience with Snowflake advanced features including Snowpipe, Streams, Tasks, and Stored Procedures.
Familiarity with ETL orchestration tools such as Airflow, dbt, or Matillion.
Ability to work with semi-structured data formats such as JSON and Parquet.