AWS Data Engineer

TN, IN, India

Job Description


Responsibilities:

Lead the architectural design and development of a scalable, reliable, and flexible metadata-driven data ingestion and extraction framework on AWS using Python/PySpark (an illustrative sketch of such a framework appears after this list).

Design and implement a customizable data processing framework using Python/PySpark. This framework should be capable of handling diverse scenarios and evolving data processing requirements.

Implement data pipelines for data ingestion, transformation, and extraction, leveraging AWS cloud services.

Seamlessly integrate a variety of AWS services, including S3, Glue, Kafka, Lambda, SQS, SNS, Athena, EC2, RDS (Oracle, Postgres, MySQL), and AWS Glue Crawlers, to construct a highly scalable and reliable data ingestion and extraction pipeline.

Facilitate configuration and extensibility of the framework to adapt to evolving data needs and processing scenarios.

Develop and maintain rigorous data quality checks and validation processes to safeguard the integrity of ingested data.

Implement robust error handling, logging, monitoring, and alerting mechanisms to ensure the reliability of the entire data pipeline.
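
As a rough illustration of the metadata-driven framework described above, the sketch below shows a minimal, configuration-driven PySpark ingestion step. The config format, file names, and the load_config helper are assumptions made purely for illustration, not part of any actual framework used in this role; basic logging and an empty-result check stand in for the fuller error handling and data quality validation the responsibilities call for.

# Minimal sketch of a metadata-driven PySpark ingestion step (illustrative only).
# Config keys, paths, and the load_config helper are assumptions, not a real framework spec.
import json
import logging

from pyspark.sql import SparkSession

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("ingestion")


def load_config(path: str) -> dict:
    """Read a JSON metadata file describing one ingestion task (hypothetical format)."""
    with open(path) as f:
        return json.load(f)


def run_ingestion(config: dict) -> None:
    spark = SparkSession.builder.appName("metadata-driven-ingestion").getOrCreate()
    try:
        # Read the source dataset as described by the metadata (e.g. CSV on S3).
        df = (
            spark.read.format(config["source"]["format"])
            .options(**config["source"].get("options", {}))
            .load(config["source"]["path"])
        )

        # Config-driven transformation: keep only the columns listed in the metadata.
        columns = config.get("transform", {}).get("select_columns")
        if columns:
            df = df.select(*columns)

        # Basic data quality check: fail fast if the source produced no rows.
        if df.rdd.isEmpty():
            raise ValueError("source dataset is empty: %s" % config["source"]["path"])

        # Write to the target described by the metadata (e.g. Parquet on S3).
        (
            df.write.format(config["target"]["format"])
            .mode(config["target"].get("mode", "append"))
            .save(config["target"]["path"])
        )
        logger.info("Ingestion finished for %s", config["source"]["path"])
    except Exception:
        # In a production framework this is where alerting (e.g. SNS) would hook in.
        logger.exception("Ingestion failed for task %s", config.get("name", "<unnamed>"))
        raise


if __name__ == "__main__":
    run_ingestion(load_config("ingestion_task.json"))

In an AWS Glue job this would typically be adapted to use GlueContext and job bookmarks rather than a bare SparkSession; the plain-Spark form is used here only to keep the sketch self-contained and runnable.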



Qualifications:



Must Have:


Over 6 years of hands-on experience in data engineering, with a proven focus on data ingestion and extraction using Python/PySpark.

Extensive AWS experience is mandatory, with proficiency in Glue, Lambda, SQS, SNS, AWS IAM, AWS Step Functions, S3, and RDS (Oracle, Aurora Postgres).

4+ years of experience working with both relational and non-relational/NoSQL databases is required.

Strong SQL experience is necessary, demonstrating the ability to write complex queries from scratch.

Strong working experience with Redshift is required, along with experience in other SQL databases.

Strong scripting experience with the ability to build intricate data pipelines using AWS serverless architecture.


Complete understanding of building an end-to-end data pipeline.



Nice to have:

Strong understanding of Kinesis, Kafka, CDK.

A strong understanding of data concepts related to data warehousing, business intelligence (BI), data security, data quality, and data profiling.

Experience with Node.js and AWS CDK.


Experience with Kafka and ECS is also a plus.

About Virtusa





Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a global team of 27,000 people that cares about your growth and seeks to provide you with exciting projects, opportunities, and work with state-of-the-art technologies throughout your career with us.



Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.



Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.



Job Detail

  • Job Id
    JD3795740
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type
    Full Time
  • Salary
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    TN, IN, India
  • Education
    Not mentioned
  • Experience
    6+ years