We are seeking a talented Data Architect to join as soon as possible to design, develop, and govern our data architecture and pipelines. The ideal candidate will be collaborative, organised, able to think out of the box, and ready to pursue new opportunities. Most importantly, this role is for an individual who is passionate about making a difference through healthcare.
# Budget:
- US$180 to US$225 per day
# Key Responsibilities
- Design and implement scalable and robust data architectures for data warehousing and enterprise data platforms.
- Develop and optimize data pipelines and ETL/ELT/CDC (Change Data Capture) workflows using tools such as Fivetran and Cloud Composer.
- Collaborate with data scientists, product managers, and business stakeholders to define data requirements and create logical and physical data models.
- Manage and administer various database systems, including BigQuery, SAP HANA, and PostgreSQL.
- Ensure data quality, integrity, and security across all data platforms and pipelines.
- Work with our AI/ML teams to design data serving layers and feature stores that support Vertex AI workloads.
- Design and develop reporting frameworks and data marts to support business intelligence needs.
- Integrate data platforms with various enterprise systems (CRMs, ERPs) and third-party APIs.
- Define and implement data governance, master data management, and data cataloging strategies.
- Contribute to the full data lifecycle: requirements gathering, architecture, data modeling, development, testing, and deployment.
- Troubleshoot and resolve data platform issues to ensure high availability and optimal performance.
- Document technical designs, data lineage, and architecture for cross-functional reference.
# Required Qualifications
- Bachelor's or Master's degree in Computer Science, Software Engineering, Data Science, or a related field.
- Proficiency in one or more backend languages/frameworks, with a strong preference for Python or Go.
- Experience with building RESTful APIs and designing microservices for data delivery.
- Solid grasp of data modeling fundamentals, including Kimball and Inmon methodologies.
- Proficiency in writing complex SQL queries and experience with SQL and NoSQL databases.
- Familiarity with data warehousing concepts and best practices, including CDC.
- Strong version-control habits (Git) and experience with CI/CD pipelines.
- Excellent problem-solving, communication, and collaboration skills.
- Passion for continuous learning and adapting to emerging data technologies.
# Preferred Qualifications
- Hands-on experience designing and deploying production-grade data warehouses.
- Deep experience with Google Cloud Platform (GCP):
  - BigQuery for large-scale analytical workloads.
  - Cloud Composer for orchestrating complex data pipelines.
  - Vertex AI for AI/ML model serving and feature stores.
- Experience with other cloud providers (AWS, Azure) and their data services.
- Working knowledge of data governance frameworks, master data management, and data cataloging tools.
- Experience with data ingestion tools like Fivetran.
- Business-intelligence expertise in building dashboards and reports with Power BI or Tableau.
- Familiarity with other data technologies such as SAP HANA.
- Understanding of MLOps concepts and their application to data pipelines.
- Contributions to open-source data projects or technical blogging/presentations.
# Application Process:
- Please send your resume to careers@chromeis.com with a relevant subject line.
- If your resume is shortlisted, you will be invited to take an online AI-based assessment.
- Candidates who pass the assessment will move on to the next stage: an interview call with the end client.
- Successful candidates from all rounds will receive an offer based on the initial discussion during the first call.