to join our Data Delivery team and drive the design, development, and optimisation of scalable cloud-based data solutions. This role goes beyond traditional data engineering: we are looking for someone who can own the end-to-end data lifecycle, from pipeline architecture to data validation, exploratory analysis, and insight presentation.

The ideal candidate combines strong data engineering capabilities with the ability to collate, analyse, and interpret data to generate meaningful business value. A foundational understanding of data science concepts is expected, even if not at a fully advanced modelling level.
Location: Hybrid - Pune, India (3 days onsite)
Target Start: October 2025
Compensation: ₹25,00,000 - ₹35,00,000 INR
Responsibilities:
* Architect, build, and optimise cloud-native data pipelines and big data ecosystems using Azure and Snowflake.
* Perform data ingestion, transformation, and enrichment, ensuring high data quality and reliability across downstream use cases.
* Conduct basic exploratory data analysis (EDA) and generate insight-ready datasets to support business stakeholders and analytics teams.
* Present data findings and performance insights clearly to both technical and non-technical stakeholders.
* Ensure all Non-Functional Requirements (security, performance, scalability, DR, compliance) are embedded in design and implementation.
* Evaluate and prototype modern data technologies, leading POCs to drive innovation and continuous improvement.
* Collaborate with Data Governance teams to ensure accurate metadata and lineage tracking (e.g., via Collibra or similar tools).
* Advocate for data mesh principles, reusable data products, and best practices across engineering and business teams.
* Leverage DevSecOps and automation tools (Terraform, GitHub, CI/CD) to streamline deployment and operational excellence.
What we're looking for in our applicants:
* 10+ years of hands-on experience in data engineering in enterprise/cloud environments.
* Proven experience building data pipelines on Azure (Azure Data Factory, Databricks, Synapse, or equivalent).
* Demonstrated ability in Snowflake development, including SQL performance optimisation.
* Practical exposure to Python for data workflows and basic analytical scripting (pandas, data inspection, validation scripts).
* Experience performing data joins, validation, quality checks, and exploratory insight aggregation.
* Strong understanding of data lake/data warehouse architectures and modern data platform patterns.
* Skilled in DevOps/Infrastructure-as-Code tooling (Terraform, GitHub, CI/CD pipelines) and automated deployment strategies.
* Ability to translate complex technical concepts into clear, business-relevant insights.
* Experience collaborating with multidisciplinary teams, including Data Scientists, Analysts, Governance, and Product stakeholders.
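To give a concrete sense of the "basic analytical scripting" this role involves, here is a minimal sketch of a pandas data-quality check. The dataset, column names, and rules are hypothetical illustrations, not Keyrus specifics:

```python
import pandas as pd

# Hypothetical staged extract -- a stand-in for data landed by a pipeline.
orders = pd.DataFrame({
    "order_id": [1, 2, 2, 3, 4],
    "customer_id": [10, 11, 11, 12, None],
    "amount": [250.0, 99.5, 99.5, -10.0, 40.0],
})

def run_quality_checks(df: pd.DataFrame) -> dict:
    """Return a mapping of rule name -> number of offending rows."""
    return {
        # Second and later occurrences of an order_id count as duplicates.
        "duplicate_order_id": int(df.duplicated(subset="order_id").sum()),
        "missing_customer_id": int(df["customer_id"].isna().sum()),
        "negative_amount": int((df["amount"] < 0).sum()),
    }

issues = run_quality_checks(orders)
print(issues)  # {'duplicate_order_id': 1, 'missing_customer_id': 1, 'negative_amount': 1}
```

In practice, checks like these would gate downstream loads (e.g., fail the pipeline run or quarantine offending rows when any count is non-zero).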
Why Keyrus?
Joining Keyrus means joining a market leader in the Data Intelligence field and an (inter)national player in Management Consultancy and Digital Experience.
You will be part of a young, ever-learning enterprise with an established international network of thought-leading professionals driven by bridging the gap between innovation and business, and you will meet specialised, professional consultants in a multicultural ecosystem.
Keyrus gives you the opportunity to showcase your talents and potential, to build up experience through working with our clients, and to grow according to your capabilities and affinities, all in a great, dynamic working atmosphere.
Keyrus UK Benefits:
* Competitive holiday allowance
* Very comprehensive Private Medical Plan
* Flexible working patterns
* Workplace Pension Scheme
* Sodexo Lifestyle Benefits
* Discretionary Bonus Scheme
* Referral Bonus Scheme
* Training & Development via KLX (Keyrus Learning Experience)