to lead the design and implementation of scalable, high-performance data pipelines using Snowflake and dbt. You will define architectural best practices, optimize data platforms, and mentor junior engineers while collaborating with clients to deliver robust, production-grade data solutions.
Key Responsibilities
Architect and implement modular, test-driven ELT pipelines using dbt on Snowflake.
Design layered data models (staging, intermediate, marts / medallion architecture) aligned with dbt best practices.
Lead ingestion of structured & semi-structured data (APIs, flat files, cloud storage) integrated with CI/CD pipelines.
Define and enforce data governance & compliance (RBAC, secure data sharing, encryption).
Collaborate with analysts, data scientists, architects, and stakeholders to deliver validated business-ready datasets.
Mentor junior engineers, lead code/architecture reviews, and establish reusable frameworks.
Manage end-to-end project delivery in a client-facing consulting environment.
Required Qualifications
5-8 years of data engineering experience, with 3+ years of hands-on production experience with Snowflake & dbt.
Strong expertise in SQL & Snowflake (performance optimization, clustering, cost management).
Solid experience in