Founded in 2014, TechMango Technology Services is a leading software development company with a strong focus on emerging technologies. Our primary goal is to deliver strategic solutions that align with our business partners' technological needs.
We specialize in providing custom software solutions using the best available technologies, ensuring quality delivery within defined timelines. Our services include analysis, design, architecture, programming, testing, and long-term technical support, all aimed at achieving ultimate customer satisfaction.
Recognized as the Best Offshore Software Development Company in India, TechMango is driven by the mantra, "Clients' Vision is our Mission." We strive to be a technologically advanced and highly regarded organization, offering high-quality and cost-efficient services while fostering long-term client relationships.
We operate in the USA (Chicago, Atlanta), UAE (Dubai), and India (Bangalore, Chennai, Madurai, Trichy, Coimbatore).
For more information, please visit our website: https://www.techmango.net
Job Title: Technical GCP Data Engineer
Location: Madurai / Chennai (WFO / Remote)
Experience: 3+ Years
Notice Period: Immediate
Job Summary
We are seeking a hands-on Technical GCP Data Engineer with deep expertise in real-time streaming data architectures to help design, build, and optimize data pipelines in our Google Cloud Platform (GCP) environment. The ideal candidate will have strong architectural vision and be comfortable rolling up their sleeves to build scalable, low-latency streaming data pipelines using Pub/Sub, Dataflow (Apache Beam), and BigQuery.
Key Responsibilities
Architect and implement end-to-end streaming data solutions on GCP using Pub/Sub, Dataflow, and BigQuery.
Design real-time ingestion, enrichment, and transformation pipelines for high-volume event data.
Work closely with stakeholders to understand data requirements and translate them into scalable designs.
Optimize streaming pipeline performance, latency, and throughput.
Build and manage orchestration workflows using Cloud Composer (Airflow).
Drive schema design, partitioning, and clustering strategies in BigQuery for both real-time and batch datasets.
Define SLAs, monitoring, logging, and alerting for streaming jobs using Cloud Monitoring (formerly Stackdriver) and Error Reporting.
Apply sound data modeling practices across datasets.
Ensure robust security, encryption, and access controls across all data layers.
Collaborate with DevOps for CI/CD automation of data workflows using Terraform, Cloud Build, and Git.
Document streaming architecture, data lineage, and deployment runbooks.
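To illustrate the ingestion–enrichment–transformation responsibility above, here is a minimal, hypothetical sketch. A real pipeline would run on Dataflow with Apache Beam reading from Pub/Sub; this stdlib-only version models the same stages as plain Python generators over an in-memory event stream (all names and the lookup table are illustrative assumptions, not TechMango's actual implementation):

```python
import json
from datetime import datetime, timezone

# Hypothetical lookup standing in for an enrichment source
# (e.g. a BigQuery dimension table or a Beam side input).
DEVICE_REGION = {"dev-1": "us-central1", "dev-2": "asia-south1"}

def ingest(raw_messages):
    """Parse raw JSON payloads, as a Pub/Sub subscriber callback might."""
    for raw in raw_messages:
        yield json.loads(raw)

def enrich(events):
    """Attach region metadata and an ingestion timestamp to each event."""
    for event in events:
        event["region"] = DEVICE_REGION.get(event["device_id"], "unknown")
        event["ingested_at"] = datetime.now(timezone.utc).isoformat()
        yield event

def transform(events):
    """Keep only the fields the downstream BigQuery schema expects."""
    for event in events:
        yield {k: event[k] for k in ("device_id", "region", "value")}

raw = ['{"device_id": "dev-1", "value": 42}',
       '{"device_id": "dev-9", "value": 7}']
rows = list(transform(enrich(ingest(raw))))
print(rows)
```

In a Beam pipeline each stage would become a `ParDo` or `Map` transform, with the final rows written via a BigQuery streaming-insert sink.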
Required Skills & Experience
3+ years of experience in data engineering or architecture.
1.5+ years of hands-on GCP data engineering experience.
Strong expertise in:
Google Pub/Sub
Dataflow (Apache Beam)
BigQuery (including streaming inserts)
Cloud Composer (Airflow)
Cloud Storage (GCS)
Solid understanding of streaming design patterns, exactly-once delivery, and event-driven architecture.
Deep knowledge of SQL and NoSQL data modeling.
Hands-on experience with monitoring and performance tuning of streaming jobs.
Experience using Terraform or equivalent for infrastructure as code.
Familiarity with CI/CD pipelines for data workflows.
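On exactly-once delivery: Pub/Sub itself guarantees at-least-once delivery, so in practice "exactly-once" processing usually means making the consumer idempotent (Dataflow's built-in deduplication or a BigQuery MERGE would normally handle this). A minimal sketch with hypothetical names, using an in-memory set where production code would use a durable store:

```python
processed_ids = set()  # in production: Bigtable, Redis, or Dataflow dedup state

def handle_message(message_id, payload, sink):
    """Apply a message's effect at most once, even if Pub/Sub redelivers it."""
    if message_id in processed_ids:
        return False  # duplicate delivery: acknowledge and skip
    sink.append(payload)          # the side effect (e.g. a BigQuery insert)
    processed_ids.add(message_id)
    return True

sink = []
handle_message("m-1", {"value": 1}, sink)
handle_message("m-1", {"value": 1}, sink)  # redelivery of the same message
handle_message("m-2", {"value": 2}, sink)
print(len(sink))
```

The key design point is that deduplication is keyed on a stable message identifier, so retries and redeliveries change nothing downstream.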
Job Type: Full-time
Pay: ₹2,000,000.00 - ₹2,500,000.00 per year
Application Question(s):
Overall work experience:
How many years of experience do you have as a Data Engineer?
How many years of experience do you have in GCP (BigQuery)?
How many years of experience do you have in Google Pub/Sub, Dataflow (Apache Beam), BigQuery (including streaming inserts), Cloud Composer (Airflow), and Cloud Storage (GCS)?
Last working date:
Current CTC:
Expected CTC:
Preferred work location: Chennai / Madurai (WFO / Remote)?
Work Location: In person