We are seeking an experienced Data Engineer with over 4 years of hands-on experience in building and optimizing data pipelines and architectures on Google Cloud Platform (GCP). The ideal candidate will have strong expertise in data integration, transformation, and modeling, with a focus on delivering scalable, efficient, and secure data solutions. This role requires a deep understanding of GCP services, big data processing frameworks, and modern data engineering practices.
Key Responsibilities:
Design, develop, and deploy scalable and reliable data pipelines on Google Cloud Platform.
Build data ingestion processes from various structured and unstructured sources using Cloud Dataflow, Pub/Sub, BigQuery, and other GCP tools.
Optimize data workflows for performance, reliability, and cost-effectiveness.
Implement data transformations, cleansing, and validation using Apache Beam, Spark, or Dataflow.
Work closely with data analysts, data scientists, and business stakeholders to understand data needs and translate them into technical solutions.
Ensure data security and compliance with company and regulatory standards.
Monitor, troubleshoot, and enhance data systems to ensure high availability and accuracy.
Participate in code reviews, design discussions, and continuous integration/deployment processes.
Document data processes, workflows, and technical specifications.
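As an illustrative sketch only (not part of the role description), the record-level cleansing and validation work described above is the kind of logic a candidate would typically wrap in an Apache Beam DoFn before writing to BigQuery. The field names and rules below are hypothetical:

```python
from typing import Optional

# Illustrative sketch: record-level cleansing/validation of the kind
# typically wrapped in an Apache Beam DoFn. Field names are hypothetical.

def clean_record(record: dict) -> Optional[dict]:
    """Normalize a raw event dict; return None if it fails validation."""
    required = ("user_id", "event_type", "amount")
    # Validation: drop records with missing or empty required fields.
    if any(record.get(k) in (None, "") for k in required):
        return None
    try:
        amount = float(record["amount"])  # coerce the numeric field
    except (TypeError, ValueError):
        return None
    # Cleansing: trim whitespace, normalize case, standardize types.
    return {
        "user_id": str(record["user_id"]).strip(),
        "event_type": str(record["event_type"]).strip().lower(),
        "amount": round(amount, 2),
    }

raw = [
    {"user_id": " 42 ", "event_type": "Purchase", "amount": "19.999"},
    {"user_id": "7", "event_type": "", "amount": "5"},  # fails validation
]
cleaned = [r for r in (clean_record(x) for x in raw) if r is not None]
```

In a real Dataflow job this function would run per-element inside a `beam.Map` or `beam.DoFn`, with invalid records routed to a dead-letter output rather than silently dropped.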
Required Skills:
Minimum 4 years of experience in data engineering, with at least 2 years working on GCP.
Strong proficiency in GCP services such as BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Composer, Cloud Functions, and Vertex AI (preferred).
Hands-on experience with SQL, Python, and Java/Scala for data processing and transformation.
Experience with ETL/ELT development, data modeling, and data warehousing concepts.
Familiarity with CI/CD pipelines, version control (Git), and DevOps practices.
Solid understanding of data security, IAM, encryption, and compliance within cloud environments.
Experience with performance tuning, workload management, and cost optimization in GCP.
Preferred Qualifications:
GCP Professional Data Engineer Certification.
Experience with real-time data processing using Kafka, Dataflow, or Pub/Sub.
Familiarity with Terraform, Cloud Build, or other infrastructure-as-code tools.
Exposure to data quality frameworks and observability tools.
Previous experience in an agile development environment.
Job Types: Full-time, Permanent
Pay: ₹473,247.51 - ₹2,000,000.00 per year
Schedule:
Monday to Friday
Application Question(s):
Mention your last working date.
Experience:
Google Cloud Platform: 4 years (Preferred)
Python: 4 years (Preferred)
ETL: 4 years (Preferred)
Work Location: In person
Beware of fraud agents! Do not pay money to get a job.
MNCJobsIndia.com will not be responsible for any payment made to a third-party. All Terms of Use are applicable.