We don't just keep up with technology--we set the pace. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges.
If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what's possible, this is your moment.
Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.
Inviting applications for the role of
Manager - Data Engineer (AWS, Python, Spark, Databricks for ETL - Agentic AI)!
In this role, you'll be part of Genpact's transformation under GenpactNext, as we lead the shift to Agentic AI Solutions--domain-specific, autonomous systems that redefine how we deliver value to clients. You'll help drive the adoption of innovations like the Genpact AP Suite in finance and accounting, with more Agentic AI products set to expand across service lines.
Responsibilities
Design, develop, and manage scalable ETL pipelines using AWS Glue, Databricks, Apache Spark, and Python to process structured and unstructured data from diverse sources (see the first sketch after this list).
Build and orchestrate data workflows integrating with services such as AWS Lambda, Step Functions, S3, and Redshift, ensuring high availability and performance (see the second sketch after this list).
Optimize Spark jobs for performance and cost-efficiency across Databricks and AWS Glue environments using partitioning, job bookmarks, and dynamic frame operations (also illustrated in the first sketch after this list).
Maintain secure data solutions in AWS, leveraging IAM roles, KMS encryption, and VPC-based security to meet compliance and governance standards (see the third sketch after this list).
Migrate legacy ETL jobs and data from on-prem systems to cloud-native architectures on AWS Glue, Redshift, and DynamoDB.
Implement and monitor data pipeline performance, debugging and tuning Spark jobs to ensure reliable execution and minimal downtime.
Contribute to the design and review of technical solutions, translating business requirements and user stories into scalable data engineering architectures.
Conduct unit testing and data validation to ensure functional correctness of pipelines before deployment.
Contribute to production deployment and collaborate with release management to ensure seamless delivery of data solutions.
Recommend cost-effective, secure, and high-performing cloud-based data solutions, reducing manual overhead and operational burden.
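The sketches below give a flavour of this work; they are minimal illustrations, not Genpact code, and every database, table, bucket, job, and ARN name in them is a placeholder. The first is an AWS Glue ETL job in PySpark that reads incrementally from the Glue Data Catalog as a dynamic frame (job bookmarks track what has already been processed), applies a simple mapping, and writes partitioned Parquet to S3.

import sys
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)  # required for job bookmarks to take effect

# Read only new data from the catalog as a DynamicFrame (placeholder names)
source = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db",
    table_name="orders",
    transformation_ctx="source",  # bookmark state is keyed on this context
)

# Light typing and renaming via a dynamic frame operation
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[
        ("order_id", "string", "order_id", "string"),
        ("amount", "double", "amount", "double"),
        ("order_date", "string", "order_date", "string"),
    ],
    transformation_ctx="mapped",
)

# Write partitioned Parquet to S3 for downstream Redshift Spectrum or Athena queries
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={
        "path": "s3://example-curated-bucket/orders/",
        "partitionKeys": ["order_date"],
    },
    format="parquet",
    transformation_ctx="sink",
)

job.commit()  # persists the bookmark so the next run skips already-processed data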
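The second sketch shows the orchestration side: a small Lambda handler, assumed to be wired to an S3 event notification, that hands the newly landed object off to a Step Functions state machine which would in turn run the Glue job, data-quality checks, and a Redshift load. The state machine ARN is a placeholder.

import json
import boto3

sfn = boto3.client("stepfunctions")

def lambda_handler(event, context):
    # S3 event notifications carry the bucket and key of the newly landed object
    record = event["Records"][0]["s3"]
    payload = {
        "bucket": record["bucket"]["name"],
        "key": record["object"]["key"],
    }
    # Start the ETL state machine run for this object (placeholder ARN)
    response = sfn.start_execution(
        stateMachineArn="arn:aws:states:us-east-1:123456789012:stateMachine:etl-pipeline",
        input=json.dumps(payload),
    )
    return {"executionArn": response["executionArn"]}

In practice a handler like this would also include idempotency checks and dead-letter handling so that duplicate or malformed events do not trigger duplicate pipeline runs.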
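The third sketch touches the security responsibilities: server-side KMS encryption applied at write time with boto3. The bucket, object key, and KMS alias are placeholders; access itself would be granted through the IAM role attached to the job and constrained by VPC endpoint and bucket policies.

import boto3

s3 = boto3.client("s3")

# Encrypt the object at rest with a customer-managed KMS key (placeholder alias)
s3.put_object(
    Bucket="example-curated-bucket",
    Key="orders/order_date=2024-01-01/part-0000.parquet",
    Body=b"serialized data goes here",
    ServerSideEncryption="aws:kms",
    SSEKMSKeyId="alias/example-data-key",
)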
Qualifications we seek in you!
Minimum Qualifications
Experience in designing and implementing data pipelines, building data applications, and performing data migration on AWS
Strong experience implementing data lakes using AWS services such as Glue, Lambda, Step Functions, and Redshift
Experience with Databricks is an added advantage
Strong experience in Python and SQL
Proven expertise in AWS services such as S3, Lambda, Glue, EMR, and Redshift.
Advanced programming skills in Python for data processing and automation.
Hands-on experience with Apache Spark for large-scale data processing.
Proficiency in SQL for data querying and transformation.
Strong understanding of security principles and best practices for cloud-based environments.
Experience with monitoring tools and implementing proactive measures to ensure system availability and performance.
Excellent problem-solving skills and ability to troubleshoot complex issues in a distributed, cloud-based environment.
Strong communication and collaboration skills to work effectively with cross-functional teams.
Preferred Qualifications/Skills
Bachelor's degree in business information systems (IS), computer science, or a related field, or equivalent related IT experience.
AWS Data Engineering & Cloud certifications, Databricks certifications
Familiarity with multiple data integration technologies and cloud platforms
Knowledge of Change & Incident Management processes
Why join Genpact?
Lead AI-first transformation - Build and scale AI solutions that redefine industries
Make an impact - Drive change for global enterprises and solve business challenges that matter
Accelerate your career - Gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills
Grow with the best - Learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace
Committed to ethical AI - Work in an environment where governance, transparency, and security are at the core of everything we build
Thrive in a values-driven culture - Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress
Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: Up.
Let's build tomorrow together.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation.
Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.