At Codvo, software and people transformations go hand in hand. We are a global empathy-led technology services company. Product innovation and mature software engineering are part of our core DNA. Respect, Fairness, Growth, Agility, and Inclusiveness are the core values that we aspire to live by each day.
We continue to expand our digital strategy, design, architecture, and product management capabilities to offer expertise, outside-the-box thinking, and measurable results.
Role Overview
We are looking for a hands-on Principal Engineer with deep expertise in Databricks to design, build, and scale enterprise-grade data platforms and MLOps pipelines. You will be the technical authority on how enterprises adopt and maximize Databricks -- from ingestion to governance to machine learning deployment -- and a mentor who raises the bar for engineering excellence.
Key Responsibilities
- Platform Architecture: Design and implement end-to-end data architectures on Databricks Lakehouse, covering ingestion, transformation, storage, and analytics.
- Pipelines & Workflows: Build and optimize ETL/ELT pipelines with Delta Live Tables, Spark Structured Streaming, and workflow orchestration.
- Governance & Security: Implement Unity Catalog, fine-grained access controls, and compliance frameworks across enterprise data estates.
- MLOps at Scale: Operationalize ML models using MLflow, Model Registry, and CI/CD pipelines integrated with cloud DevOps tools.
- Performance & Cost Optimization: Tune Databricks clusters, jobs, and workflows for scale, speed, and efficiency across multi-cloud deployments.
- Client Advisory: Work closely with enterprise stakeholders to provide best practices, reference architectures, and accelerators tailored to their use cases.
- Mentorship & Standards: Guide engineers in Databricks best practices, enforce coding standards, and lead design/code reviews.
Qualifications
- 8+ years in large-scale data engineering / platform engineering, with 3+ years of hands-on Databricks experience.
- Deep expertise in:
  + Databricks Lakehouse Platform (Delta Lake, Delta Live Tables, Databricks SQL).
  + Governance & Security with Unity Catalog.
  + MLOps with MLflow and model lifecycle management.
- Strong programming skills in PySpark, SQL, and Python; experience with Scala a plus.
- Hands-on with cloud integration (AWS, Azure, or GCP) and DevOps pipelines (Terraform, GitHub Actions, Azure DevOps, etc.).
- Proven track record of building and scaling Databricks workloads in production for enterprise clients.