Senior Data Platform Engineer - Bangalore, India


Job Description

THE ROLE





As a Data Platform Engineer at MPOWER, you will be designing and scaling the cloud-native data systems that power our analytics, operations, and mission-driven decisions. This role is perfect for someone who thrives on building automated pipelines, shaping resilient infrastructure, and enabling insights that impact students around the world.



You'll join a high-impact, cross-functional team that values automation, observability, and secure-by-default architecture. You'll work closely with engineering, analytics, and product stakeholders to ensure data flows seamlessly, securely, and with purpose. Your responsibilities will include, but are not limited to:


• Building, deploying, and maintaining production-grade data workflows using Apache Airflow, AWS Glue, and Lambda to support analytics and business operations (see the sketch after this list)
• Managing and provisioning cloud-native infrastructure using Terraform, including services such as Redshift, S3, RDS, Athena, and IAM
• Developing and maintaining automated CI/CD pipelines (e.g., Bitbucket Pipelines, GitHub Actions) to deploy data jobs and infrastructure seamlessly
• Implementing and tuning monitoring, alerting, and logging systems using Datadog, CloudWatch, and custom dashboards to ensure platform reliability and observability
• Writing and optimizing efficient Python and SQL code to support data ingestion, transformation, and workflow automation
• Enforcing secure data operations by applying best practices with IAM, AWS Secrets Manager, and audit-compliant access controls
• Partnering with cross-functional stakeholders to understand analytical and operational data needs and deliver scalable solutions
• Recommending and integrating tools and practices for observability, data validation, and performance optimization across the data platform
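To give a flavor of the orchestration work in the first bullet, below is a minimal sketch of a daily extract-and-load workflow, assuming an Apache Airflow 2.x deployment; the DAG id, schedule, and the extract/load callables are hypothetical placeholders, not an actual MPOWER pipeline.

    # Minimal Airflow DAG sketch: extract data, then load it into the warehouse.
    # Assumes Airflow 2.x; all names below are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract_source_data():
        """Placeholder: pull raw records from an upstream system into S3."""
        ...


    def load_to_redshift():
        """Placeholder: copy the staged files from S3 into Redshift."""
        ...


    with DAG(
        dag_id="daily_analytics_ingest",   # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract", python_callable=extract_source_data)
        load = PythonOperator(task_id="load", python_callable=load_to_redshift)

        extract >> load                    # run extract before load

In practice, a DAG like this would be deployed through the CI/CD pipelines and monitored with the Datadog/CloudWatch alerting described above.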

THE QUALIFICATIONS




• A Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field
• 4-6 years of experience in Data Platform, Data Engineering, DevOps, or DataOps roles
• Proven experience working with cloud-native data platforms, especially AWS
• Strong proficiency in Python and SQL for data transformation, automation, and workflow logic (see the sketch after this list)
• Hands-on experience with orchestration tools such as Apache Airflow, AWS Glue, and Lambda for managing ETL/ELT pipelines
• Deep knowledge of AWS services including Redshift, S3, IAM, RDS, and Athena, and experience managing them via Terraform (IaC)
• Proven ability to build and manage CI/CD pipelines (e.g., Bitbucket Pipelines, GitHub Actions) for data infrastructure automation
• Experience with monitoring and alerting tools like Datadog and CloudWatch, with a focus on reliability and observability
• Strong problem-solving and collaboration skills, with the ability to work cross-functionally and communicate technical concepts clearly to non-technical stakeholders
• Excellent written and verbal English communication skills
• A passion for financial inclusion and access to higher education is a must, as well as comfort working with a global team across multiple time zones and locations!
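As a small illustration of the Python and SQL proficiency listed above, here is a sketch that runs a SQL aggregation through Amazon Athena via boto3; the region, database, table, and S3 output location are assumptions made up for the example.

    # Run a SQL aggregation on Athena and wait for it to reach a terminal state.
    # The region, database, table, and output bucket below are hypothetical.
    import time

    import boto3

    athena = boto3.client("athena", region_name="ap-south-1")  # assumed region

    QUERY = """
        SELECT application_date, COUNT(*) AS applications
        FROM loan_applications                 -- hypothetical table
        GROUP BY application_date
    """

    response = athena.start_query_execution(
        QueryString=QUERY,
        QueryExecutionContext={"Database": "analytics"},        # assumed database
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )
    query_id = response["QueryExecutionId"]

    # Poll until the query finishes, then report its final state.
    state = "QUEUED"
    while state in ("QUEUED", "RUNNING"):
        time.sleep(2)
        status = athena.get_query_execution(QueryExecutionId=query_id)
        state = status["QueryExecution"]["Status"]["State"]

    print(f"Athena query {query_id} finished with state {state}")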



In addition, you should be comfortable working in a fast-growth environment: a small, agile team, fast-evolving roles and responsibilities, a variable workload, tight deadlines, a high degree of autonomy, and an 80-20 approach to everything.



Job Detail

  • Job Id: JD4006252
  • Industry: Not mentioned
  • Total Positions: 1
  • Job Type: Full Time
  • Salary: Not mentioned
  • Employment Status: Permanent
  • Job Location: Bangalore, Karnataka, India
  • Education: Not mentioned
  • Experience: 4-6 years