Chennai, Tamil Nadu, India
Department
Data Science/Big Data Mining
Job posted on
Nov 20, 2025
Employment type
Full Time Employee
We are looking for an experienced Data Solution Architect to design, architect, and implement large-scale data platforms and analytics solutions. The ideal candidate will have strong hands-on experience in Databricks, Azure, and Google Cloud Platform (GCP), along with the ability to create animated/visual data stories for technical and business audiences.
The candidate will play a key role in defining data strategy, optimizing data pipelines, ensuring platform scalability, and developing reusable frameworks.
Required Skills & Experience
10-12+ years overall in data engineering, analytics, or architecture roles.
2-4+ years strong hands-on experience in Databricks (mandatory).
Deep experience with Azure and GCP cloud ecosystems.
Strong expertise in Spark, PySpark, SQL, Python.
Proficiency in data modeling, ETL frameworks, and distributed data systems.
Experience with streaming technologies (Kafka, Event Hub, Pub/Sub).
Ability to design and present animated architecture diagrams and data flows.
Strong communication and stakeholder management skills.
Key Responsibilities
1. Architecture & Solution Design
Design end-to-end data lakehouse, data warehouse, and analytics architectures using Databricks, Azure, and GCP.
Build scalable and cost-optimized architectures for ingestion, transformation, streaming, and ML workloads.
Define and enforce architecture standards, patterns, and governance models.
Translate business requirements into logical and physical data models.
2. Databricks Expertise
Architect and optimize Databricks Lakehouse solutions using Delta Lake, Unity Catalog, Databricks SQL, MLflow, Auto Loader, and DLT (a brief illustrative sketch follows this section).
Implement advanced transformations using PySpark, Spark SQL, and notebooks.
Design job orchestration using Databricks Workflows or other orchestration tools.
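For illustration only, the following is a minimal PySpark sketch of the Auto Loader-to-Delta ingestion pattern referenced above. It assumes a Databricks runtime; the storage paths, checkpoint location, and target table name are hypothetical placeholders, not an actual implementation used in this role.

# Incrementally ingest new JSON files with Auto Loader and append them to a Delta table.
# All paths and the target table are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

raw = (
    spark.readStream.format("cloudFiles")                        # Auto Loader source
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/tmp/schemas/orders")  # hypothetical schema path
    .load("/mnt/landing/orders")                                 # hypothetical landing path
)

cleaned = raw.withColumn("ingested_at", F.current_timestamp())   # simple example transformation

query = (
    cleaned.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/orders")     # hypothetical checkpoint path
    .trigger(availableNow=True)                                  # process available files, then stop
    .toTable("main.bronze.orders")                               # hypothetical Unity Catalog table
)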
3. Cloud Platform Ownership
Azure
Azure Data Factory, Azure Data Lake Storage (ADLS), Azure Synapse, Azure Functions, Event Hub, Azure DevOps, Azure Kubernetes Service (AKS).
Architecture of secure and scalable data systems on Azure.
GCP
BigQuery, Cloud Storage, Dataflow, Pub/Sub, Dataproc, Cloud Composer.
Hands-on experience designing lakehouse/analytics solutions on GCP.
4. Data Engineering & Integration
Lead development of batch and real-time data pipelines using Spark, ADF, Dataflow, or Databricks Workflows (a short streaming example follows this section).
Implement ETL/ELT frameworks, CI/CD, and reusable components.
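As an illustration of the kind of real-time pipeline described in this section, the sketch below reads events from Kafka with Spark Structured Streaming and lands them in Delta. The broker address, topic, schema, and output paths are hypothetical placeholders, and the same pattern would apply to Event Hub or Pub/Sub sources.

# Read a hypothetical 'payments' topic from Kafka, parse the JSON payload, and write to Delta.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.getOrCreate()

event_schema = StructType([                                      # hypothetical event schema
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")            # hypothetical broker
    .option("subscribe", "payments")                             # hypothetical topic
    .load()
    .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/payments")   # hypothetical checkpoint path
    .outputMode("append")
    .start("/mnt/silver/payments")                               # hypothetical output path
)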
6. Security & Governance
Implement data governance frameworks including Unity Catalog, encryption, and compliance (a brief illustration follows below).
Ensure best practices for cost optimization, monitoring, and performance tuning.
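For illustration, a minimal sketch of Unity Catalog-style governance applied from a notebook: granting table access to a group and masking a sensitive column through a dynamic view. The catalog, schema, table, column, and group names are hypothetical placeholders, and the sketch assumes a Unity Catalog-enabled Databricks workspace.

# Grant read access on a governed table and mask a sensitive column for non-privileged users.
# All object and group names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.sql("GRANT SELECT ON TABLE main.silver.payments TO `data-analysts`")

spark.sql("""
    CREATE OR REPLACE VIEW main.silver.payments_masked AS
    SELECT
        event_id,
        CASE WHEN is_account_group_member('pii-readers')   -- only PII readers see the raw value
             THEN card_number
             ELSE 'REDACTED'
        END AS card_number
    FROM main.silver.payments
""")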