to design, build, and optimise cloud-native data platforms in Azure. You will work across modern data architectures, build scalable data solutions, and collaborate with multidisciplinary teams to drive data excellence and platform innovation.
Location: Hybrid - Pune, India (3 days onsite)
Target Start: January 2026
Compensation: ₹30,00,000 - ₹42,00,000 INR
Responsibilities:
Data Engineering
Design, build, and optimise cloud-native ingestion, transformation, and enrichment pipelines using Azure (ADF, Databricks, Synapse); see the illustrative sketch after this list.
Implement data joins, validation frameworks, profiling, and quality checks to ensure analytics-ready datasets.
Engineer scalable feature pipelines and reusable data products for advanced analytics and ML workloads.
Maintain data quality, lineage, and metadata in collaboration with Data Governance teams.
Contribute to automated deployment workflows using DevOps/CI/CD, GitHub Actions, Terraform, and related tools.
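Purely as an illustration of the day-to-day work described above, and not a prescribed solution, the sketch below shows a minimal raw-to-curated pipeline with simple quality checks. It assumes a Databricks or other Delta-enabled Spark environment; the paths, table name, and column names (RAW_PATH, curated.orders, order_id, order_amount) are hypothetical placeholders.

```python
# Minimal PySpark sketch: raw-to-curated ingestion with simple quality checks.
# Assumes a Databricks (or Delta-enabled Spark) environment; all paths and
# column names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

RAW_PATH = "/mnt/raw/orders/"          # hypothetical landing-zone path
CURATED_TABLE = "curated.orders"       # hypothetical Delta table name

# Ingest raw files (format and schema handling kept deliberately simple).
raw_df = (
    spark.read
    .option("header", "true")
    .csv(RAW_PATH)
)

# Basic profiling / quality checks: row counts, null keys, duplicates.
total_rows = raw_df.count()
null_keys = raw_df.filter(F.col("order_id").isNull()).count()
deduped_df = raw_df.dropDuplicates(["order_id"])

if total_rows == 0 or null_keys > 0:
    raise ValueError(
        f"Quality check failed: rows={total_rows}, null order_id={null_keys}"
    )

# Light enrichment: typed columns and a load timestamp to support lineage.
curated_df = (
    deduped_df
    .withColumn("order_amount", F.col("order_amount").cast("double"))
    .withColumn("_loaded_at", F.current_timestamp())
)

# Persist as an analytics-ready Delta table.
curated_df.write.format("delta").mode("overwrite").saveAsTable(CURATED_TABLE)
```

In practice a step like this would typically be parameterised and orchestrated from ADF or a Databricks job, with quality thresholds agreed alongside the Data Governance teams mentioned above.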
Data Science
Perform end-to-end Exploratory Data Analysis (EDA) to identify trends, anomalies, and insights.
Build, validate, and deploy machine learning models (supervised and unsupervised); see the illustrative sketch after this list.
Develop feature engineering logic, model training pipelines, and performance monitoring processes.
Work with business and technical stakeholders to define analytical problems and deliver data-driven solutions.
Communicate analytical insights and model outcomes to technical and non-technical audiences.
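Again purely for illustration, the sketch below shows the shape of a supervised modelling workflow of the kind listed above: feature scaling, training, and hold-out validation, written in Python with scikit-learn. The dataset is synthetic and every name and parameter in it is a hypothetical stand-in.

```python
# Minimal scikit-learn sketch of a supervised modelling workflow: feature
# scaling, training, and hold-out validation. The synthetic dataset stands in
# for whatever business data the role actually works with.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report

# Synthetic stand-in data (hypothetical; replace with real features/labels).
X, y = make_classification(n_samples=5_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# Feature engineering + model combined into a single reproducible pipeline.
model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1_000)),
])
model.fit(X_train, y_train)

# Hold-out evaluation as a simple form of performance monitoring.
print(classification_report(y_test, model.predict(X_test)))
```

In a real engagement the same structure would sit behind whatever feature pipelines and monitoring processes the team standardises on.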
Analytics & Visualisation
Build dashboards and analytical visuals using Power BI, Tableau, or similar tools.
Present data findings and insights in a clear, accessible narrative.
What we're looking for in our applicants:
10+ years of hands-on experience in data engineering within enterprise or cloud-native environments.
Proven experience building production-grade data pipelines on Azure (Azure Data Factory, Databricks, Synapse, or equivalent).
Strong proficiency in Python for data engineering, automation, and pipeline development.
Deep understanding of data lake/data warehouse architectures and modern data platform patterns.
Extensive experience with data migration, integration of distributed data sources, and handling large-scale datasets.
Skilled in DevOps/CI/CD, Agile delivery, and automated deployment workflows.
Experience collaborating with Data Governance, Cloud Engineering, and Product teams.
Previous experience with Snowflake is a plus.
Excellent communication skills with the ability to articulate complex technical concepts clearly.
Why Keyrus?
Joining Keyrus means joining a market leader in the Data Intelligence field and an (inter)national player in Management Consultancy and Digital Experience.
You will be part of a young, ever-learning enterprise with an established international network of thought-leading professionals driven to bridge the gap between innovation and business. You will get the opportunity to meet specialised, professional consultants in a multicultural ecosystem.
Keyrus gives you the opportunity to showcase your talents and potential, to build up experience by working with our clients, and to grow according to your capabilities and affinities, all in a great, dynamic working atmosphere.