GCP Data Architect

MH, India

Job Description

Role Overview:



We are seeking a highly skilled GCP Data Architect with 6-8 years of experience in designing, developing, and managing enterprise data solutions on Google Cloud Platform (GCP). The ideal candidate will have a strong background in cloud data architecture, data warehousing, big data processing, and data integration, with proven expertise in delivering scalable, secure, and efficient data platforms.

Key Responsibilities:



Design and architect end-to-end data solutions on GCP, aligning with business and technical requirements.
Define data models, storage strategies, data ingestion, processing, and consumption frameworks.
Implement data lakes, data warehouses, and data marts using services like BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub, and Composer.
Collaborate with business stakeholders, data scientists, and engineering teams to understand data needs and translate them into scalable architectures.
Design and implement data governance, security, and compliance frameworks for cloud-based data platforms.
Optimize data workflows, query performance, and storage costs in the GCP environment.
Lead data migration and modernization initiatives from on-premises or other cloud platforms to GCP.
Stay updated with GCP services, features, and industry best practices to recommend improvements and innovation.
Provide technical leadership and mentoring to data engineering teams.

Required Skills & Experience:



6-8 years of experience in data architecture and engineering roles, including at least 3 years of hands-on experience on GCP.
Strong expertise in GCP data services: BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud Storage, Cloud Composer, Data Catalog.
Proficient in data modeling, data warehousing concepts, ETL/ELT pipelines, and big data processing frameworks.
Experience with SQL, Python, and Terraform (preferred) for infrastructure as code.
Hands-on experience in data security, encryption, access control, and governance on GCP.
Experience integrating real-time data pipelines and event-driven architectures.
Strong understanding of DevOps, CI/CD pipelines for data workflows, and cloud cost optimization.
GCP Professional Data Engineer / Cloud Architect certification is a plus.

Good to Have:

Exposure to AI/ML workflows and data preparation for ML models.
Experience with tools such as Apache Airflow, Looker, or Dataplex.
Knowledge of other cloud platforms (AWS, Azure) for hybrid/multi-cloud strategies.

Educational Qualification:


Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, or a related field.

Why Join Us:



Work on cutting-edge data transformation programs at scale.

Opportunity to architect high-impact solutions in a collaborative, innovation-driven environment.

Engage with a fast-growing team focused on data-driven business value.

Job Type: Full-time

Work Location: In person



Job Detail

  • Job Id: JD3845351
  • Total Positions: 1
  • Job Type: Full Time
  • Employment Status: Permanent
  • Job Location: MH, India