DevOps Engineer

IN, India

Job Description

DevOps Engineer - Airflow & Cloud Implementation Specialist (5-7 Years Experience)


Location: Faridabad / Remote (WFH)

Job Type: Full-time

Experience Level: Mid to Senior


At Zonda Home, we are redefining data-driven customer experiences through automation, scalability, and cloud-native architecture. We are seeking a skilled DevOps Engineer with a deep focus on Apache Airflow orchestration and cloud infrastructure implementation. In this role, you will design, deploy, and optimize data pipelines and CI/CD processes across cloud environments, supporting scalable data integration and high-performing systems.


You'll play a key role in bridging DevOps and Data Engineering, enabling robust data orchestration, reliable infrastructure automation, and cloud-first delivery models.


Key Responsibilities

  • Design and implement Airflow-based ETL/ELT workflows for data pipeline orchestration, scheduling, and monitoring.
  • Manage Airflow DAGs across multiple environments, ensuring modularity, reliability, and maintainability.
  • Deploy and scale Airflow on Kubernetes (GKE/EKS/AKS) or VM-based cloud instances.
  • Automate infrastructure provisioning using Terraform or Cloud Deployment Manager.
  • Design and manage scalable, secure architectures on GCP, AWS, or Azure (AWS preferred).
  • Optimize data integration between cloud-native storage (e.g., GCS, S3) and cloud data warehouses such as BigQuery, Snowflake, or Redshift.
  • Implement CI/CD pipelines using tools like GitHub Actions, GitLab CI/CD, or Jenkins, including DAG testing and deployments.
  • Ensure observability using tools like Prometheus, Grafana, and Stackdriver, and integrate alerts into collaboration platforms (e.g., Slack, Teams).
  • Automate and manage secrets, configurations, and security policies via Vault, KMS, or Secrets Manager.
  • Enable cost-efficient cloud usage through performance tuning, autoscaling, and budget monitoring.
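The DAG management duties above rest on one core property: task dependencies must form a directed acyclic graph. As an illustration only (the task names are hypothetical, and the standard library stands in for Airflow's own DAG APIs), a minimal sketch of validating and ordering such a dependency graph, as a DAG-integrity test in CI might do:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical ETL task graph: each key lists its upstream tasks,
# mirroring how an Airflow DAG encodes task dependencies.
tasks = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"validate"},
}

# A valid DAG must be acyclic; TopologicalSorter raises CycleError
# otherwise, which is exactly what a CI-time DAG test would assert.
order = list(TopologicalSorter(tasks).static_order())
print(order)  # tasks in a dependency-respecting execution order
```

A real deployment would run this kind of check (via Airflow's own DAG-parsing utilities) in the CI pipeline before any DAG reaches production.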

Required Skills & Experience

  • 5-7 years of experience in DevOps or cloud infrastructure roles.
  • 5+ years of hands-on experience with Apache Airflow (authoring, deploying, and managing DAGs in production).
  • Proficiency in cloud platforms (AWS preferred; GCP/Azure also acceptable).
  • Strong scripting skills in Python and Bash; experience with Airflow custom operators is a plus.
  • Solid experience with Infrastructure as Code (IaC) tools such as Terraform.
  • Strong understanding of data engineering principles, batch scheduling, and data reliability.
  • Hands-on experience with Docker and Kubernetes for orchestrating microservices and Airflow deployments.
  • Knowledge of CI/CD pipelines, GitOps, and automated release processes.
  • Experience integrating Airflow with BigQuery, Cloud SQL, S3/GCS, APIs, and messaging systems like Pub/Sub or Kafka.
  • Familiarity with monitoring and alerting best practices.
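The data-reliability expectation above is typically met with automatic retries on transient failures. A minimal stdlib-only sketch (the `retry` helper and `flaky_load` task are hypothetical stand-ins for Airflow's built-in per-task `retries` and `retry_delay` settings):

```python
import time

def retry(func, attempts=3, base_delay=0.01):
    """Retry a flaky task with exponential backoff -- a simplified
    stand-in for Airflow's per-task retry configuration."""
    for attempt in range(attempts):
        try:
            return func()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the failure
            time.sleep(base_delay * 2 ** attempt)

# Hypothetical flaky task: fails twice, then succeeds.
calls = {"n": 0}
def flaky_load():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "loaded"

result = retry(flaky_load)
print(result, calls["n"])  # succeeds on the third attempt
```

In production the same idea is expressed declaratively on the operator (`retries=3, retry_delay=...`) rather than hand-rolled, so Airflow's scheduler and UI track each attempt.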

Preferred Qualifications

  • Bachelor's degree in Computer Science, Data Engineering, or a related technical field.
  • Certifications in GCP, AWS, or DevOps tools (e.g., GCP Professional DevOps Engineer, CKA).
  • Exposure to AI and ML pipeline orchestration, data quality frameworks, or metadata management tools (e.g., Great Expectations, OpenLineage) is a bonus.

Why Join Zonda Home

  • Own and lead the Airflow and cloud architecture strategy in a high-impact role.
  • Work with a forward-thinking team using modern DevOps, data, and cloud-native tooling.
  • Be part of a growing engineering culture that values automation, transparency, and innovation.
  • Enjoy flexible remote work, autonomy, and the opportunity to influence enterprise-scale delivery.

Beware of fraud agents! Do not pay money to get a job.

MNCJobsIndia.com will not be responsible for any payment made to a third party. All Terms of Use are applicable.


Job Detail

  • Job Id
    JD3826821
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type
    Full Time
  • Salary
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    IN, India
  • Education
    Not mentioned
  • Experience
    5-7 Years