AWS Data Engineer

TN, India

Job Description

Dear Candidate,



Greetings of the day!



I am Amutha, and I'm reaching out to you regarding an exciting opportunity with TechMango. You can connect with me on LinkedIn: https://www.linkedin.com/in/amutha-valli-32611b289/ or by email: amutha.m@techmango.net



Techmango Technology Services is a full-scale software development services company founded in 2014 with a strong focus on emerging technologies. Its primary objective is to deliver strategic technology solutions that advance its business partners' goals.

We are a leading full-scale software and mobile app development company. Techmango is driven by the mantra "Client's Vision is Our Mission".

We stay true to that statement: to be a technologically advanced and well-regarded organization providing high-quality, cost-efficient services with a long-term client relationship strategy. We operate in the USA (Chicago, Atlanta), Dubai (UAE), and India (Bangalore, Chennai, Madurai, Trichy).

Role: AWS Data Engineer


Key Skills: Data Engineering, Python, AWS services, SQL, ETL, PySpark



Experience: 4 to 10 Yrs


Location: Madurai


Mode: Hybrid / WFO



Job Title: AWS Data Engineer



About the Role:


We're looking for a seasoned AWS Data Engineer to join our Engineering organization and take ownership of the cloud-based systems that power our most critical data-driven decisions. You'll design, architect, and maintain high-performance, scalable data pipelines and cloud data warehouses using AWS-native services such as Redshift, Glue, S3, Lambda, and Step Functions. This is a high-impact role, ideal for someone who thrives on solving complex data challenges, optimizing distributed systems, and being the go-to expert on a collaborative, high-performance engineering team.

What You Will Do:



Architect, build, and maintain robust and scalable data pipelines using AWS services.

Implement performant and reliable ETL/ELT processes that handle large volumes of structured and unstructured data.

Enforce and monitor data SLAs to ensure freshness, reliability, and availability of datasets across environments.

Collaborate with engineering, product, and analytics teams to transform business requirements into robust data models and pipelines.

Proactively identify and resolve bottlenecks, data quality issues, and system inefficiencies.

Implement schema versioning, data lineage tracking, and database change management practices.

Define and enforce best practices for data governance, access control, observability, and compliance.

Contribute to CI/CD workflows and infrastructure as code practices using tools like CloudFormation or Terraform.


What You Will Bring:



4+ years of experience in data engineering or backend systems development, with a strong focus on cloud-based architectures.

Deep expertise in the AWS data ecosystem, especially Redshift, Glue, S3, Athena, Lambda, Step Functions, and CloudWatch.

Strong background in SQL performance tuning, schema design, indexing, and partitioning strategies for large datasets.

Experience maintaining data freshness SLAs and end-to-end ownership of production pipelines.

Hands-on experience with Python (or PySpark), T-SQL, and scripting automation for data ingestion and transformation.

Solid understanding of relational and dimensional data modeling, normalization, and schema evolution.

Experience with source control systems (e.g., Git, Bitbucket) and CI/CD pipelines for data infrastructure.

Track record of transforming complex business requirements into reliable and scalable data solutions.

Experience with data governance, security, and compliance frameworks (e.g., HIPAA, GDPR) is a plus.

Familiarity with monitoring and observability tools (e.g., CloudWatch, Datadog, or Prometheus).

Bonus: Exposure to Snowflake or MSSQL in hybrid cloud environments.


Nice to Have:



AWS certifications such as AWS Certified Data Analytics, Solutions Architect, or DevOps Engineer.

Experience with Apache Airflow, dbt, or other data orchestration tools.

Familiarity with Kafka, Kinesis, or other streaming technologies.

Understanding of data mesh or data lakehouse architectures.


Job Type: Full-time

Pay: ₹500,000.00 - ₹1,200,000.00 per year

Application Question(s):

Overall work experience:
How many years of experience do you have in Data Engineering?
How many years of experience do you have in AWS?
How many years of experience do you have in Python / PySpark?
How many years of experience do you have in SQL / ETL?
May I know your last working date?
Are you comfortable working in hybrid mode at the Madurai location (each month, 1 to 2 weeks in the Madurai office and 2 weeks remote)?
Work Location: In person

Beware of fraud agents! Do not pay money to get a job.

MNCJobsIndia.com will not be responsible for any payment made to a third-party. All Terms of Use are applicable.


Job Detail

  • Job Id
    JD4799258
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type
    Full Time
  • Salary
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    TN, IN, India
  • Education
    Not mentioned
  • Experience
    Year