Key Responsibilities:
Design, develop, and maintain scalable stored procedures using SQL, Python, and Snowflake (see the illustrative sketch following this list).
Leverage Flyway for database version control and schema migration across multiple environments.
Build and manage data orchestration workflows using frameworks such as Apache Airflow or Prefect (see the orchestration sketch following this list).
Implement robust CI/CD pipelines using Bitbucket Pipelines for deployment and automation of data engineering tasks.
Optimize and monitor data workflows to ensure performance, reliability, and scalability.
Work with AWS services such as S3, Lambda, and RDS for cloud-based data engineering tasks.
Collaborate with Data Analysts, Data Scientists, and other Engineers to understand requirements and deliver reliable data solutions.
Maintain documentation of data flows, architecture, and technical standards.
Ensure data quality, security, and compliance with organizational and regulatory standards.
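The following is a purely illustrative sketch, not part of the role description: it shows the kind of stored-procedure invocation the responsibilities above refer to, using the snowflake-connector-python package. The connection parameters and the procedure name REFRESH_DAILY_SALES are hypothetical placeholders.

```python
# Illustrative sketch only: calls a hypothetical Snowflake stored procedure
# via snowflake-connector-python. All names and credentials are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",    # placeholder account identifier
    user="etl_user",         # placeholder user
    password="***",          # in practice, pull from a secrets manager
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # Invoke a hypothetical stored procedure that rebuilds a daily aggregate.
    cur.execute("CALL REFRESH_DAILY_SALES(%s)", ("2024-01-01",))
    print(cur.fetchone())
finally:
    conn.close()
```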
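Similarly, a minimal Apache Airflow (2.x) orchestration sketch, with a hypothetical dag_id, schedule, and task callables chosen only for illustration:

```python
# Illustrative Airflow 2.x DAG sketch; dag_id, schedule, and callables are
# hypothetical and not taken from this posting.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_to_s3():
    """Placeholder extract step; a real task would land raw files in S3."""
    print("extracting...")


def load_to_snowflake():
    """Placeholder load step; a real task would call a Snowflake procedure."""
    print("loading...")


with DAG(
    dag_id="daily_sales_pipeline",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+ parameter
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_to_s3", python_callable=extract_to_s3)
    load = PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)
    extract >> load  # run the extract before the load
```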
Required Skills:
SQL: Advanced skills in data extraction, transformation, and optimization.
Python: Strong experience in data processing, scripting, and automation.
Snowflake: Deep knowledge of Snowflake architecture, performance tuning, and best practices.
Flyway: Proven experience in managing database migrations and versioning.
AWS: Hands-on experience with cloud data services (S3, Lambda, Glue, etc.).
Orchestration Frameworks: Experience with Airflow, Prefect, or similar tools.
CI/CD: Experience in setting up and managing Bitbucket Pipelines or equivalent.
Strong problem-solving, debugging, and analytical skills.
Excellent communication and collaboration abilities.
Location: DGS India - Bengaluru - Manyata H2 block
Brand: Merkle
Time Type: Full time
Contract Type: Permanent