Job Title: Data Engineer
Location: Hybrid - Pune
Experience Required: 5-7 years
Budget: 6-9 LPA
Notice Period: Immediate joiners to 15 days (only candidates currently serving notice)

Mandatory Skills: GCP (Google Cloud Platform), Python, SQL

Must-Have Skills:
- GCP (Google Cloud Platform), Python, SQL
- Python or Java for building and managing data pipelines
- Orchestration tools: Cloud Composer (Airflow)
- Cloud data warehousing experience

Responsibilities:
- Design and implement scalable data pipelines (ETL/ELT) to transfer data from on-prem SQL Server to GCP PostgreSQL.
- Optimize and refactor SQL Server stored procedures, schemas, and data types for PostgreSQL compatibility.
- Develop automated scripts in Python or Java for migration, data validation, and reconciliation.
- Apply data governance and security standards across cloud-stored data assets.
- Collaborate with DBAs, developers, and analysts to assess source systems and define the migration architecture.
- Execute data quality checks and implement robust logging, monitoring, and alerting using GCP tools (e.g., Cloud Monitoring).
- Leverage tools such as pgLoader and custom scripts for schema conversion and row-level parity assurance.
- Support a smooth transition during cutover with minimal downtime.

Required Expertise:
- Strong SQL (T-SQL and PostgreSQL) and relational database fundamentals
- Proven data engineering experience with GCP
- Hands-on experience with cloud monitoring, pipeline services, and version control (Git)
- Deep familiarity with ETL processes and orchestration frameworks (Airflow)
- Solid scripting abilities (Python/Java) for automation

Good-to-Have Skills:
- Exposure to database migration services (e.g., AWS DMS, GCP Database Migration Service)
- Real-time data ingestion (Pub/Sub)
- Shell scripting proficiency
- CI/CD pipeline integration for data workflows
- Familiarity with NoSQL databases (Firestore, Bigtable)
Job Type: Full-time
Pay: ₹50,000.00 - ₹75,000.00 per month
Work Location: In person