Principal Data Engineer

Remote, IN, India

Job Description

Experience Required: 8+ Years



Roles & Responsibilities:

Lead the end-to-end architecture and development of a modern enterprise-scale data platform from the ground up.

Collaborate with cloud and security architects to ensure platform scalability, performance, and compliance.

Architect, design, and implement batch and real-time streaming data infrastructures and workloads.

Build and maintain data lakehouse architectures within the GCP ecosystem.

Design and develop connector frameworks and reusable data ingestion pipelines to source data from both on-premises and cloud systems.

Architect and implement metadata management, including data catalogs, lineage, quality, and observability frameworks.

Design and develop data quality frameworks and governance processes to ensure reliability and accuracy.

Develop microservices-based components using Kubernetes, Docker, and Cloud Run to abstract platform and infrastructure complexities.

Design and optimize data storage, transformation, and querying performance for large-scale datasets while ensuring cost efficiency.

Implement observability tooling (Grafana, Datadog) and DataOps best practices, including CI/CD and test automation.

Collaborate with data scientists and analysts to define data models, schemas, and advanced analytics capabilities.

Drive deployment, release management, and platform scalability initiatives.

Stay ahead of emerging data engineering trends, tools, and best practices to continuously evolve the platform.

Skills:

8+ years of proven experience in modern cloud data engineering and enterprise data platform architecture.



Demonstrated success in architecting and delivering large-scale greenfield data platform projects.

Deep expertise in Google Cloud Platform (GCP) and its ecosystem: BigQuery, Cloud Functions, Cloud Run, Dataform, Dataflow, Dataproc, SQL, Python, and Airflow.

Strong understanding of streaming technologies such as Kafka or Pub/Sub.

Hands-on experience with microservices architectures using Kubernetes, Docker, and Cloud Run.

Proven ability to design semantic layers and metadata-driven architectures.

Expertise in data modeling, data architecture, and data governance principles.

Experience with observability and monitoring tools (Grafana, Datadog).

Strong understanding of DataOps principles, including automation, CI/CD, and testing for data pipelines.

Experience architecting secure, scalable, and high-performance data solutions.

Job Type: Contractual / Temporary
Contract length: 3 months

Pay: ₹130,000.00 - ₹160,000.00 per month

Experience:

Work experience: 8 years (Required)
Work Location: Remote



Job Detail

  • Job Id
    JD4628379
  • Total Positions
    1
  • Job Location
    Remote, IN, India