Scripting: Proficiency in Python for data processing, scripting, and interaction with APIs.
Preferred Qualifications
Snowflake Expertise (5+ years): Proven experience designing and implementing solutions on Snowflake. Deep understanding of its architecture, specifically Virtual Warehouses, Time Travel, Zero-Copy Cloning, and Tasks.
Expert SQL (7+ years): Mastery of writing highly efficient and complex SQL for large datasets, including window functions, recursive CTEs, and procedural logic.
Stored Procedure Development: Extensive experience developing and debugging stored procedures in a modern data warehousing environment (preferably Snowflake SQL/JavaScript/Python).
Data Warehousing: Solid background in data warehousing concepts, including ELT/ETL principles, data modeling techniques (dimensional, 3NF), and change data capture (CDC).
Apache Airflow (2+ years): Hands-on experience developing, deploying, and managing production-grade Airflow DAGs. Knowledge of operators, sensors, and best practices for Airflow deployment.
Cloud Services: Experience with cloud environments (AWS, Azure, or GCP) for infrastructure supporting data pipelines (e.g., S3, ADLS, GCS).
Git/CI/CD: Experience with version control (Git) and CI/CD pipelines for deploying data warehouse changes and orchestration code.
Familiarity with data governance tools and practices.
Any other location / shift timing for the requirement: No
Primary Skills: Snowflake, SQL, Stored Procedures
Secondary Skill: Airflow
Job Type: Contractual / Temporary
Contract length: 10 months
Pay: Up to ₹2,000,000.00 per year
Work Location: Remote