Leadership & Team Management
Lead and mentor a team of DataOps engineers in designing and maintaining robust data pipelines.
Plan, assign, and review team tasks to ensure timely, high-quality delivery.
Collaborate with data engineers, data scientists, and business teams to prioritize data needs and ensure alignment with organizational goals.
Drive best practices in coding standards, documentation, and deployment automation.
Technical Delivery
Design and implement scalable ETL/ELT pipelines using Pentaho, StreamSets, and Python-based frameworks.
Manage real-time and batch data ingestion using Kafka for streaming and MySQL/Snowflake for storage and transformation (see the Kafka ingestion sketch after this list).
Implement and maintain data quality checks, validation, and reconciliation frameworks (see the reconciliation sketch after this list).
Ensure pipeline observability, error handling, and alerting mechanisms for proactive issue resolution (see the retry-and-alert sketch after this list).
Optimize Snowflake and MySQL queries for performance and cost efficiency.
Lead migration or modernization initiatives (e.g., on-prem to Snowflake/cloud).
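To make the ingestion responsibilities above concrete, here is a minimal sketch of a Python-based streaming pipeline that consumes JSON events from Kafka and upserts them into MySQL. The topic, consumer group, table, and connection details are hypothetical placeholders, and the sketch assumes the kafka-python and mysql-connector-python packages; production code would add batching, schema validation, and dead-letter handling.

```python
import json

import mysql.connector
from kafka import KafkaConsumer

# Hypothetical topic, group, and table names, for illustration only.
consumer = KafkaConsumer(
    "orders",
    bootstrap_servers=["localhost:9092"],
    group_id="dataops-ingest-demo",
    auto_offset_reset="earliest",
    enable_auto_commit=False,
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

conn = mysql.connector.connect(
    host="localhost", user="etl_user", password="...", database="staging"
)
cursor = conn.cursor()

UPSERT = """
    INSERT INTO orders_raw (order_id, payload)
    VALUES (%s, %s)
    ON DUPLICATE KEY UPDATE payload = VALUES(payload)
"""

for message in consumer:
    event = message.value
    cursor.execute(UPSERT, (event["order_id"], json.dumps(event)))
    conn.commit()      # persist the row first...
    consumer.commit()  # ...then the offset, so a crash replays rather than drops events
```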
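One common reconciliation check is comparing row counts between the MySQL source and the Snowflake target for the current load date. The sketch below is illustrative only; the table names, columns, and credentials are placeholders, and it assumes the mysql-connector-python and snowflake-connector-python packages.

```python
import mysql.connector
import snowflake.connector

def count_rows(cursor, query: str) -> int:
    """Run a COUNT(*) query and return the single scalar result."""
    cursor.execute(query)
    return int(cursor.fetchone()[0])

mysql_conn = mysql.connector.connect(
    host="localhost", user="etl_user", password="...", database="staging"
)
sf_conn = snowflake.connector.connect(
    account="...", user="etl_user", password="...",
    warehouse="ETL_WH", database="ANALYTICS", schema="RAW",
)

source_count = count_rows(
    mysql_conn.cursor(),
    "SELECT COUNT(*) FROM orders_raw WHERE load_date = CURDATE()",
)
target_count = count_rows(
    sf_conn.cursor(),
    "SELECT COUNT(*) FROM ORDERS_RAW WHERE LOAD_DATE = CURRENT_DATE()",
)

# A mismatch here would feed the alerting path shown in the next sketch.
if source_count != target_count:
    raise RuntimeError(
        f"Reconciliation failed: source={source_count}, target={target_count}"
    )
```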
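For observability and alerting, one minimal pattern is wrapping each pipeline step in retry-with-backoff logic and posting to an alerting webhook on final failure. The webhook URL is a placeholder, `step` is any zero-argument callable, and the sketch assumes the requests package; a real deployment would route this through the team's incident tooling.

```python
import logging
import time

import requests  # used here as a generic alerting webhook client

ALERT_WEBHOOK = "https://hooks.example.com/dataops"  # placeholder URL

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def run_with_retries(step, *, attempts: int = 3, backoff_s: float = 5.0):
    """Run a zero-argument pipeline step, retrying transient failures."""
    for attempt in range(1, attempts + 1):
        try:
            return step()
        except Exception:
            log.exception("%s failed (attempt %d/%d)", step.__name__, attempt, attempts)
            if attempt == attempts:
                # Final failure: notify the on-call channel, then re-raise.
                requests.post(ALERT_WEBHOOK, json={"step": step.__name__, "status": "failed"})
                raise
            time.sleep(backoff_s * attempt)  # linear backoff between attempts

# Usage, where load_orders is a hypothetical pipeline step:
# run_with_retries(load_orders, attempts=5)
```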
Governance & Operations
Maintain data security, access control, and compliance with enterprise standards.
Define and track DataOps KPIs such as pipeline success rates, latency, and data quality metrics (see the KPI sketch after this list).
Partner with Infrastructure and DevOps teams for seamless environment management and scalability.
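Assuming pipeline runs are logged to a hypothetical pipeline_runs table with a status column and start/finish timestamps, the success-rate and latency KPIs above can be computed with a single aggregate query, as in this sketch (connection details are placeholders):

```python
import mysql.connector

conn = mysql.connector.connect(
    host="localhost", user="ops_user", password="...", database="dataops"
)
cursor = conn.cursor()

# Success rate and average end-to-end latency over the trailing 7 days.
cursor.execute("""
    SELECT
        AVG(status = 'success')                             AS success_rate,
        AVG(TIMESTAMPDIFF(SECOND, started_at, finished_at)) AS avg_latency_s
    FROM pipeline_runs
    WHERE started_at >= NOW() - INTERVAL 7 DAY
""")
success_rate, avg_latency_s = cursor.fetchone()

if success_rate is None:
    print("No pipeline runs recorded in the last 7 days")
else:
    print(f"7-day success rate: {success_rate:.1%}, avg latency: {avg_latency_s:.0f}s")
```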
Technical Skills Required:
Databases:
Strong expertise in MySQL (query optimization, stored procedures, schema design).
Advanced knowledge of Snowflake (data modeling, performance tuning, cost optimization).
ETL & Data Pipeline Tools:
Hands-on experience with Pentaho Data Integration (Kettle) and/or StreamSets for ETL/ELT automation (see the orchestration sketch below).
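As one example of scripting Kettle for automation, a Python wrapper can invoke a transformation through PDI's Pan command-line tool and fail loudly on a non-zero exit code. All paths and parameter names below are placeholders, assuming a standard PDI installation:

```python
import subprocess

# All paths and parameter names are placeholders.
result = subprocess.run(
    [
        "/opt/pentaho/data-integration/pan.sh",       # Pan runs .ktr transformations
        "-file=/etl/transformations/load_orders.ktr",
        "-level=Basic",                               # PDI log verbosity
        "-param:LOAD_DATE=2024-01-01",                # named transformation parameter
    ],
    capture_output=True,
    text=True,
)

# Pan exits non-zero on failure; surface the log for triage or alerting.
if result.returncode != 0:
    raise RuntimeError(
        f"Kettle transformation failed ({result.returncode}):\n{result.stderr}"
    )
```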