to lead the design and development of scalable data solutions. You'll architect robust pipelines, orchestrate workflows, and enable seamless cross-cloud data movement to power advanced analytics and machine learning initiatives.
Key Responsibilities
Design and implement scalable data pipelines using Dataflow and Python
Build and optimize data warehouses with BigQuery
Orchestrate workflows using Cloud Composer (Airflow)
Integrate real-time data streams via Pub/Sub
Architect and manage cross-cloud data movement strategies (e.g., GCP ↔ Azure/AWS)
Collaborate with data scientists, analysts, and platform engineers to deliver high-quality datasets
Ensure data governance, security, and performance across systems
Required Skills
5-7 years of experience in data engineering, with strong GCP exposure
Proficiency in BigQuery, Dataflow, Cloud Composer, and Pub/Sub
Advanced Python skills for data processing and automation
Experience with cross-cloud architecture and data migration strategies
Solid understanding of data modeling, partitioning, and performance optimization
Familiarity with CI/CD and version control (e.g., Git)
Bonus Skills
Experience with Databricks, Delta Lake, or Apache Beam
Knowledge of Terraform or infrastructure-as-code tools
Exposure to Azure Data Factory or AWS Glue
Why Join Us
Work on high-impact, enterprise-scale data projects
Flexible work culture with global collaboration
Opportunity to shape multi-cloud data architecture
Competitive compensation and career growth
Job Types: Full-time, Contractual / Temporary, Freelance
Contract length: 6 months
Pay: ₹80,000.00 - ₹100,000.00 per month
Work Location: Remote