Enterprise Data Architect

Bangalore, Karnataka, India

Job Description

Job Information

Number of Positions: 1
Date Opened: 11/20/2025
Job Type: Full time
Industry: IT Services
Work Experience: 12+ years
Last Activity Time: 12/01/2025 13:44
City: Bangalore
State/Province: Karnataka
Country: India
Zip/Postal Code: 560068

Enterprise Data Architect

Location: Bangalore
Experience: 15+ years in Data Engineering & Data Platforms
Employment Type: Full-time

About Aptus Data Labs

Aptus Data Labs is a global Data Engineering and AI solutions partner helping enterprises build modern, scalable, and intelligence-driven organizations. With deep expertise across cloud platforms, advanced analytics, AI/ML, and enterprise data transformation, we empower businesses to unlock the full value of their data. Our focus on innovation, domain excellence, and engineering quality enables us to deliver high-impact platforms, ranging from enterprise data lakes to AI-driven automation and industry-specific solutions. Trusted by leading companies across the US, India, Africa, and Europe, Aptus Data Labs is committed to shaping future-ready digital ecosystems that drive growth, efficiency, and strategic advantage.

About the Role

We are seeking a highly accomplished Enterprise Data Architect with deep expertise in designing modern data platforms, integrating complex enterprise datasets, and driving large-scale digital and AI initiatives. The ideal candidate brings 15+ years of experience in data engineering, data platforms, data governance, data integration, and data operations, including 5+ years of hands-on Databricks Lakehouse implementation on AWS and strong Reltio MDM experience.

This role is instrumental in shaping the enterprise data foundation, leading multi-domain integrations, and enabling AI-ready architectures across global teams in the US, India, and Ireland.

Key Responsibilities

1. Lead the enterprise data architecture strategy, focusing on scalability, modernization, interoperability, and business alignment.
2. Architect and operationalize Databricks Lakehouse solutions on AWS, including ingestion, transformation, orchestration, governance, and consumption layers.
3. Design and implement the Medallion architecture (Bronze-Silver-Gold) with 100+ source integrations using Boomi Integrator and Databricks pipelines (see the illustrative sketch after this list).
4. Drive enterprise Master Data Management (MDM) using Reltio, including entity modeling, data quality, match-merge rules, survivorship, workflows, and golden record stewardship.
5. Establish frameworks for metadata management, data quality, lineage, cataloging, and governance, leveraging Unity Catalog and AWS-native security.
6. Enable AI and analytics teams by building AI-ready datasets, feature stores, and GenAI-supporting data pipelines.
7. Provide leadership, mentorship, and architectural oversight for data engineering, governance, and platform teams.
8. Implement enterprise standards for data security, IAM, compliance (GDPR/HIPAA), observability, and cloud cost optimization.
9. Collaborate with global business stakeholders across the US, India, and Ireland to drive data modernization, cloud migration, and aligned domain strategies.
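For context on responsibilities 2 and 3, the sketch below shows one possible Bronze-Silver-Gold flow on Databricks using PySpark and Delta tables. It is a minimal sketch only, assuming a Databricks workspace with Unity Catalog schemas already in place; the catalog/schema/table names, the S3 path, and the column names are hypothetical placeholders, not details of the actual platform.

    # Minimal illustrative sketch of a Bronze-Silver-Gold (Medallion) flow on Databricks.
    # Catalog/schema/table names, the S3 path, and the columns are hypothetical placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # Bronze: land raw source data as-is, stamped with ingestion metadata.
    bronze = (spark.read.format("json")
              .load("s3://example-bucket/raw/customers/")        # hypothetical source location
              .withColumn("_ingested_at", F.current_timestamp()))
    bronze.write.format("delta").mode("append").saveAsTable("main.bronze.customers_raw")

    # Silver: cleanse, standardize, and deduplicate the Bronze data.
    silver = (spark.table("main.bronze.customers_raw")
              .filter(F.col("customer_id").isNotNull())
              .withColumn("email", F.lower(F.trim("email")))
              .dropDuplicates(["customer_id"]))
    silver.write.format("delta").mode("overwrite").saveAsTable("main.silver.customers")

    # Gold: business-level aggregate ready for analytics and AI consumption.
    gold = (spark.table("main.silver.customers")
            .groupBy("country")
            .agg(F.countDistinct("customer_id").alias("active_customers")))
    gold.write.format("delta").mode("overwrite").saveAsTable("main.gold.customers_by_country")

In practice each layer would be scheduled and orchestrated (for example as Databricks jobs fed by Boomi integrations) and governed through Unity Catalog, in line with responsibilities 2, 3, and 5.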

Requirements

Required Skills & Experience

• 15+ years of hands-on experience in data engineering, data platforms, data integration, data governance, and data operations.
• Strong hands-on experience with Reltio MDM, including configuration, hierarchy management, entity modeling, match/merge, and golden record creation.
• 5+ years of solid expertise in Databricks Lakehouse (Delta Lake, PySpark, Unity Catalog, MLflow, and Databricks AI).
• Proven expertise in the AWS data ecosystem: S3, Glue, EMR, Lambda, Athena, Redshift, Lake Formation, IAM.
• Strong experience implementing the Medallion architecture across 100+ data sources using:
  + Boomi Integrator
  + Databricks pipelines (batch and streaming)
• Advanced proficiency in SQL, Python, PySpark, and API-driven integrations.
• Deep understanding of data governance, metadata, lineage, observability, and MDM frameworks (a brief Unity Catalog permissions sketch follows this list).
• Experience with modern data stack tools (Snowflake, dbt, Airflow, Kafka) is an advantage.
• Excellent verbal and written communication skills, with the ability to collaborate effectively with teams across the US, India, and Ireland.
• AWS or Databricks certifications preferred.
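To make the Unity Catalog governance point above concrete, here is a minimal sketch, assuming a Unity Catalog-enabled Databricks workspace, of how table- and schema-level access might be granted from a notebook; the catalog, schemas, table, and group names are hypothetical placeholders.

    # Illustrative only: Unity Catalog permissions issued from a Databricks notebook.
    # The catalog (main), schemas, table, and group names are hypothetical placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Let an analytics group read a curated Gold table.
    spark.sql("GRANT SELECT ON TABLE main.gold.customers_by_country TO `analytics-readers`")

    # Let a data engineering group work within the Silver schema.
    spark.sql("GRANT USE SCHEMA, CREATE TABLE ON SCHEMA main.silver TO `data-engineers`")

    # Review the grants currently in place on the Gold table.
    spark.sql("SHOW GRANTS ON TABLE main.gold.customers_by_country").show(truncate=False)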

Education

Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field.

Job Detail

  • Job Id
    JD4838007
  • Total Positions
    1
  • Job Type
    Full Time
  • Employment Status
    Permanent
  • Job Location
    Bangalore, Karnataka, India