Design, develop, and maintain scalable and efficient data pipelines using Snowflake, SQL, and ETL tools.
Migrate existing ETL workflows to Snowflake and optimize for performance and cost.
Implement and manage data warehouse architectures, data models, and best practices.
Collaborate with data architects, business analysts, and other stakeholders to gather requirements and ensure data quality and availability.
Work with large datasets, design data transformation frameworks, and develop solutions to support analytics and reporting.
Design and implement monitoring, alerting, and performance tuning strategies.
Ensure data security, privacy, and governance compliance across data assets.
Document technical solutions, data flows, and system configurations.
________________________________________
Required Skills & Qualifications:
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
10 to 12 years of professional experience in Data Engineering / ETL / Data Warehousing.
At least 3 to 5 years of hands-on experience with Snowflake.
Proficient in SQL, Python, and at least one ETL tool (e.g., Informatica, Talend, SSIS, DataStage, dbt).
Deep understanding of data modeling (star and snowflake schemas), data marts, and dimensional modeling.
Experience with cloud platforms (AWS, Azure, or GCP) and cloud-native data integration tools.
Strong performance tuning, troubleshooting, and debugging skills for Snowflake and ETL jobs.
Familiarity with CI/CD practices and version control tools like Git.
Excellent communication, documentation, and interpersonal skills.