ETL Modernization Engineer (OSP Partner)
This role is for an OSP provider resource to migrate legacy ETL pipelines (Informatica/Talend) to a modern data stack on Snowflake, dbt, and Fivetran, with strong DataOps, governance, and cost-aware design. Target experience: 7 to 8 years.
Role Summary
Lead end-to-end migration of legacy ETL workloads to Snowflake, dbt, and Fivetran, including discovery, design, refactoring, validation, and cutover.
Establish reusable patterns for ingestion, transformation, orchestration, testing, observability, and cost optimization.
Collaborate with Enterprise Data Architecture, Security, and BI teams to ensure compliant, high-performance delivery.
Key Responsibilities
Assess current-state: inventory Informatica/Talend jobs, mappings, schedules, dependencies, SLAs, and data contracts.
Design target-state: ingestion via Fivetran/ELT, dbt-based transformations, Snowflake schemas (raw/bronze, curated/silver, semantic/gold), and orchestration approach.
Migrate and refactor:
Convert mappings/workflows to dbt models, macros, seeds, and exposures.
Replace hand-coded ingestions with managed connectors (Fivetran) or alternative ELT where required.
Implement CDC patterns (e.g., Fivetran + dbt snapshots), SCD handling, and incremental strategies (see the incremental model sketch after this list).
Data quality and testing: implement dbt tests (schema, referential, accepted values, freshness), anomaly checks, and reconciliation with legacy outputs (see the test sketch after this list).
Performance engineering: optimize Snowflake warehouse sizing, clustering key strategies, query tuning, caching/materialization patterns, and costs.
Security and governance: apply RBAC roles, masking, and row access policies (see the policy sketch after this list); integrate with lineage/catalog tooling (e.g., OpenLineage/Marquez, Collibra/Atlan/Alation if applicable).
Observability and reliability: configure logging/metrics, job run health, SLAs/SLOs, alerting, and incident runbooks.
Cutover planning: parallel runs, backfills, data reconciliation (see the reconciliation sketch after this list), defect triage, rollout, and decommissioning of legacy jobs.
Documentation and knowledge transfer: architecture diagrams, runbooks, playbooks, and training for client teams.
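As a reference for the CDC and incremental-strategy responsibility above, a minimal dbt incremental model is sketched below. The model, source, and column names (stg_orders, raw.orders, order_id, updated_at) are illustrative assumptions, not the client's actual objects.

    -- models/silver/stg_orders.sql (hypothetical model; dbt SQL + Jinja)
    -- Incremental materialization: only rows changed since the last run are
    -- merged on the unique key; a full refresh rebuilds the whole table.
    {{ config(
        materialized = 'incremental',
        unique_key = 'order_id',
        incremental_strategy = 'merge'
    ) }}

    select
        order_id,
        customer_id,
        order_status,
        order_total,
        updated_at
    from {{ source('raw', 'orders') }}

    {% if is_incremental() %}
      -- on incremental runs, pick up only rows newer than what is already loaded
      where updated_at > (select max(updated_at) from {{ this }})
    {% endif %}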
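The dbt testing expectation can be met with generic tests declared in YAML alongside the models, or with singular tests written in SQL. A minimal referential-integrity check is sketched below as a singular test; the model names are hypothetical.

    -- tests/assert_orders_have_customers.sql (hypothetical singular test)
    -- dbt treats any returned rows as test failures: here, orders whose
    -- customer_id has no match in the customers model.
    select o.order_id
    from {{ ref('stg_orders') }} o
    left join {{ ref('stg_customers') }} c
      on o.customer_id = c.customer_id
    where c.customer_id is null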
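For the security and governance item, the sketch below shows one way to apply Snowflake column masking and row access policies; the schema, role names, and mapping table (curated, PII_READER, region_role_map) are assumptions to be replaced by the client's RBAC design.

    -- Mask email values for all roles except a privileged reader role.
    create masking policy curated.pii_email_mask as (val string) returns string ->
      case
        when current_role() in ('PII_READER') then val
        else '***MASKED***'
      end;

    alter table curated.customers
      modify column email set masking policy curated.pii_email_mask;

    -- Filter rows by region using a role-to-region mapping table.
    create row access policy curated.region_filter as (region string) returns boolean ->
      exists (
        select 1
        from curated.region_role_map m   -- hypothetical mapping table
        where m.role_name = current_role()
          and m.region = region
      );

    alter table curated.orders add row access policy curated.region_filter on (region);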
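For cutover reconciliation during parallel runs, a simple parity check between the legacy output (staged into Snowflake) and the migrated model is sketched below; both table names and the compared columns are placeholders.

    -- Compare row counts and a column checksum across the two sources;
    -- matching values indicate count and content parity on the hashed columns.
    select 'legacy' as source,
           count(*) as row_count,
           sum(hash(order_id, order_status, order_total)) as checksum
    from legacy_stage.orders_extract
    union all
    select 'migrated' as source,
           count(*) as row_count,
           sum(hash(order_id, order_status, order_total)) as checksum
    from curated.orders;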
Required Skills and Experience
7 to 8 years in data engineering/ETL modernization with demonstrable migration projects from Informatica or Talend to ELT on cloud data warehouses.
Hands-on with Snowflake (warehouses, tasks, streams, Time Travel, Query Profile), performance tuning, security policies, and cost governance.
Strong dbt expertise: model design, Jinja/macros, packages, exposures, snapshots (see the snapshot sketch after this list), environment promotion, and CI/CD with Git.
Practical Fivetran experience: connector configuration, sync scheduling, historical backfills, log-based CDC, and schema drift handling.
Proficient in advanced SQL; Python preferred for utilities (e.g., migration scripts, validation).
Experience building star/snowflake schemas, dimensional models, and semantic layers for BI tools (Power BI, Looker, Tableau).
DataOps and CI/CD: branching strategies, automated tests, deployment pipelines, environment management.
Data quality frameworks and reconciliation techniques for migration sign-off.
Strong stakeholder management: work with architects, security, BI, and business SMEs in phased migration programs.
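As an illustration of the snapshot and SCD handling called out above, a minimal dbt snapshot capturing Type 2 history is sketched below; the source and column names (raw.customers, customer_id, updated_at) are assumptions.

    -- snapshots/customers_snapshot.sql (hypothetical file; dbt SQL + Jinja)
    {% snapshot customers_snapshot %}
    {{ config(
        target_schema = 'snapshots',
        unique_key = 'customer_id',
        strategy = 'timestamp',
        updated_at = 'updated_at'
    ) }}

    -- dbt adds dbt_valid_from / dbt_valid_to columns, giving Type 2 history
    -- of changes to the source rows.
    select * from {{ source('raw', 'customers') }}

    {% endsnapshot %}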
Deliverables
Current-state inventory and dependency map of legacy pipelines.
Target-state architecture and migration plan, including cutover strategy.
Re-platformed pipelines: Fivetran connectors, dbt project(s), Snowflake schemas, roles, and policies.