AWS Data Engineer

Location: TN, IN, India

Job Description

Apache Airflow (MWAA)


Experience Level: 6 to 10 Years




Key Responsibilities


Workflow Design & Development: Build, deploy, and optimize DAGs (Directed Acyclic Graphs) on AWS MWAA to orchestrate data pipelines, ETL processes, and automation tasks.


Workflow Orchestration: Implement scheduling, monitoring, troubleshooting, and performance tuning for Airflow jobs running in MWAA.


AWS Resource Integration: Leverage AWS services including S3, Redshift, EMR, Lambda, Glue, and RDS within Airflow workflows.


Automation: Design and implement CI/CD pipelines for seamless deployment and updates of Airflow DAGs and plugins.


Monitoring & Logging: Configure and maintain integration with Amazon CloudWatch for monitoring workflow execution, logging, and alerting.


Security & Compliance: Use IAM roles and VPCs to enable secure access and networking for Airflow tasks, ensuring compliance with data governance policies.


Documentation & Collaboration: Create clear documentation and collaborate with analytics, engineering, and business teams to deliver high-quality data solutions.


Maintenance & Upgrades: Manage Airflow environment upgrades, scalability, and patching in MWAA.
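To make the orchestration responsibilities above concrete, here is a minimal sketch of the kind of DAG this role involves building. All names, the schedule, and the task bodies are illustrative placeholders, not taken from this posting.

```python
# Illustrative only: a minimal Airflow DAG of the kind MWAA runs.
# DAG id, schedule, and task logic are hypothetical placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: e.g. pull raw files from S3 into a staging area.
    print("extracting")


def load(**context):
    # Placeholder: e.g. load transformed data into Redshift.
    print("loading")


with DAG(
    dag_id="example_etl",                 # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task             # declare the dependency edge
```

In MWAA, a file like this is deployed by placing it under the dags/ prefix of the environment's S3 bucket, which is why CI/CD for DAGs (below) typically amounts to syncing a repository to that bucket.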




Required Skills & Qualifications


Experience: 4 to 10 years in data engineering or related field, with direct experience managing Airflow pipelines, preferably on AWS MWAA.


Programming Languages: Strong proficiency in Python (Airflow DAGs), SQL, and scripting for data tasks.


AWS Services: Hands-on expertise with AWS data and storage services (S3, Lambda, EMR, Glue, Redshift, RDS).


DevOps: Familiarity with CI/CD tools, version control (Git), and infrastructure as code practices is a plus.


Security: Understanding of IAM, VPC, networking, and data security best practices in AWS.


Problem Solving: Excellent debugging, troubleshooting, and analytical skills for distributed workflows.


Communication: Clear and effective communicator, able to convey technical concepts to cross-functional teams.




Preferred Skills


Experience with big data frameworks (Spark, Hadoop)


Exposure to containerization technologies (Docker, Kubernetes)


Familiarity with other orchestration tools (Luigi, Prefect)


Experience in cloud migration and workflow optimization




Education


Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related discipline, or equivalent experience.

About Virtusa





Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth -- one that seeks to provide you with exciting projects and opportunities, and work with state-of-the-art technologies throughout your career with us.



Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.



Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.

Beware of fraud agents! Do not pay money to get a job.

MNCJobsIndia.com will not be responsible for any payment made to a third party. All Terms of Use are applicable.


Job Detail

  • Job Id
    JD4259075
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type
    Full Time
  • Salary
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    TN, IN, India
  • Education
    Not mentioned
  • Experience
    6 to 10 Years