Principal Data Engineer

KA, IN, India

Job Description

Organization:

At CommBank, we never lose sight of the role we play in other people's financial wellbeing. Our focus is to help people and businesses move forward and progress: to make the right financial decisions and achieve their dreams, targets, and aspirations. Regardless of where you work within our organisation, your initiative, talent, ideas, and energy all contribute to the impact that we can make with our work. Together we can achieve great things.


Job Title:

Principal Data Engineer


Location:

Bangalore


Business & Team:



The Technology Team is responsible for the world-leading application of technology and operations across every aspect of CommBank, from innovative product platforms for our customers to essential tools within our business. We also use technology to drive efficient and timely processing, an essential component of great customer service. CommBank is recognised as leading the industry in IT and operations with its world-class platforms and processes, agile IT infrastructure, and innovation in everything from payments to internet banking and mobile apps.


The Group Security (GS) team protects the Bank and our customers from cyber compromise, through proactive management of cyber security, privacy, and operational risk. Our team includes:


• Cyber Strategy & Performance
• Cyber Security Centre
• Cyber Protection & Design
• Cyber Delivery
• Cyber Data Engineering
• Cyber Data Security
• Identity & Access Technology
• Fraud Management
Our Group Security Data Engineering team leads the way in adopting the Group's new strategic cloud data platform, Commbank.Data. On this platform we are accountable for driving Group Security's data-driven and proactive security function through advanced analytics and agentic AI workflows.


Impact & Contribution:



To ensure the Group achieves a sustainable competitive advantage through data engineering, you will play a key role in supporting and executing Group Security's data strategy. In this role, you will be responsible for setting up the Group Security Data Platform to ingest security telemetry from across the organisation, along with additional data assets and data products. This platform will provide security controls and services leveraged across the Group.


Roles & Responsibilities:



You will be expected to perform the following tasks in a manner consistent with CBA's Values and People Capabilities.


Core Responsibilities:



• Possess hands-on technical experience working in AWS, with a robust set of technical and soft skills; an excellent AWS Data Engineer with a focus on complex automation and engineering framework development.
• Be well-versed in Python (mandatory), with experience in developing complex frameworks using Python.
• Be passionate about Cloud/DevSecOps/Automation and keenly interested in solving complex problems systematically.
• Drive the development and implementation of scalable data solutions and data pipelines using various AWS services (an illustrative sketch follows this list).
• Work independently and collaborate closely with team members and technology leads.
• Exhibit a proactive approach, constantly seeking innovative solutions to complex technical challenges.
• Take responsibility for nominated technical assets related to areas of expertise, including roadmaps and technical direction.
• Own and develop technical strategy, overseeing medium to complex engineering initiatives.
• Lead the design and architecture of large-scale data systems, ensuring they meet business needs and performance requirements.
• Provide strategic direction for data engineering initiatives, aligning them with organizational goals.
• Mentor and guide senior data engineers, fostering a culture of continuous improvement and innovation.
• Collaborate with executive leadership to define and prioritize data engineering projects.
• Ensure the implementation of best practices in data management, security, and governance.
• Act as a System and Data Architect, designing and implementing robust, scalable, and efficient data architectures.
• Work effectively in a fast-paced, high-pressure environment while delivering high-quality results on time.
• Conduct and drive multiple extensive Proofs of Concept (PoCs) simultaneously, focusing on research, automation, and framework development.
• Bring in the latest cloud-based solutions, ensuring the organization stays at the forefront of technology.
• Experience or exposure to Generative AI, Large Language Models (LLMs), and Agentic AI is highly desirable.
• Be highly technical, with hands-on coding experience across multiple tech stacks related to data engineering and framework development on and for the AWS Cloud.
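
To give a sense of the automation and pipeline work described above, here is a minimal AWS CDK (v2, Python) sketch of an event-driven ingestion pattern: an encrypted S3 landing bucket whose new objects invoke a Lambda handler. All names (stack, bucket, handler, the lambda/ asset path) are hypothetical illustrations, not CommBank's actual implementation, and a production platform would add much more (KMS keys, logging, Glue/Step Functions, guardrails).

# Minimal, illustrative sketch only; resource names and paths are assumptions.
from aws_cdk import (
    Stack,
    aws_lambda as _lambda,
    aws_s3 as s3,
    aws_s3_notifications as s3n,
)
from constructs import Construct


class TelemetryIngestStack(Stack):
    """Event-driven ingestion: new objects in a landing bucket invoke a Lambda."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Encrypted, versioned landing bucket for raw security telemetry.
        landing = s3.Bucket(
            self,
            "RawTelemetryBucket",
            versioned=True,
            encryption=s3.BucketEncryption.S3_MANAGED,
        )

        # Handler that processes each landed object (code lives in ./lambda).
        handler = _lambda.Function(
            self,
            "IngestHandler",
            runtime=_lambda.Runtime.PYTHON_3_12,
            handler="ingest.handler",
            code=_lambda.Code.from_asset("lambda"),
        )

        # Route the bucket's object-created events to the handler and grant read access.
        landing.add_event_notification(
            s3.EventType.OBJECT_CREATED, s3n.LambdaDestination(handler)
        )
        landing.grant_read(handler)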

Essential Skills:



• 15-18 years of experience as a Data Engineering professional in a data-intensive environment, with strong analytical and reasoning skills in the relevant area.
• Proficiency in AWS cloud services, specifically EC2, S3, Lambda, Athena, Kinesis, Redshift, Glue, EMR, DMS, EventBridge, DynamoDB, IAM, Secrets Manager, KMS, Step Functions, SQS, and SNS.
• Experience with CloudFormation (CFN), the Cloud Development Kit (CDK), and the Software Development Kit (SDK). Exposure to MSK, Flink, Iceberg, CloudWatch, User Notifications, AppFlow, Fargate, ECS, EKS, Airflow, Boto3, RDS, SageMaker, and Cookiecutter is good to have.
• Excellent skills in Python-based framework development are mandatory.
• Proficiency in SQL for efficient querying, managing databases, handling complex queries, and optimizing query performance.
• Excellent automation skills are expected in areas such as:
  + Automating the testing framework and various test cases, including unit, integration, and functional tests and mock-ups, using tools such as pytest.
  + Modularising and automating the data pipeline and expediting tasks such as data ingestion and transformation using tools like DBT.
  + API-based automated and integrated calls (REST, cURL, authentication and authorization, tokens, pagination, OpenAPI, Swagger).
  + Implementing advanced engineering techniques and handling ad hoc requests to automate processes on demand.
  + Implementing automated and secured file transfer protocols such as XCOM, FTP, SFTP, and HTTP/S.
• Experience with Terraform, Jenkins, TeamCity, and Artifactory is essential as part of DevOps; Docker and Kubernetes are also considered.
• Proficiency in building orchestration workflows using Apache Airflow.
• Strong understanding of streaming data processing concepts, including event-driven architectures.
• Familiarity with CI/CD pipeline development, for example with Jenkins.
• Extensive experience and understanding of data modelling, SCD types, data warehousing, and ETL processes.
• Excellent experience with GitHub or a comparable version control system.
• Expertise in data pipeline development using various data formats and types.
• Mandatory knowledge and experience in big data processing using PySpark/Spark, including performance optimization of applications.
• Proficiency in handling various file formats (CSV, JSON, XML, Parquet, Avro, and ORC) and automating processes in the big data environment (see the sketch after this list).
• Ability to use Linux/Unix environments for development and testing.
• Awareness of security best practices to protect data and infrastructure, including encryption/decryption, tokenization, masking, firewalls, and security zones.
• Well-structured documentation skills and the ability to create a well-defined knowledge base.
• Ability to perform extreme engineering and design robust, efficient, and cost-effective data engineering pipelines that are highly available and dynamically scalable on demand, enabling systems to respond to high demand and heavy load while maintaining high throughput and high I/O performance with no data loss.
• Own and lead the end-to-end data engineering life cycle, from requirements gathering through design, development, testing, delivery, and support, as part of the DevSecOps process.
• Demonstrated skills and mindset to implement encryption methodologies such as SSL/TLS, data encryption at rest and in transit, and other data security best practices.
• Hands-on experience with data design tools such as Erwin, with demonstrated capability to build data models, data warehouses, data lakes, data assets, and data products.
• Ability to constructively challenge the status quo, lead the establishment of data governance and metadata management, ask the right questions, and design with the right principles.
• Extensive experience in leading and managing large-scale data engineering teams and projects.
• Proven track record of delivering high-impact data solutions that drive business value.
• Ability to influence and drive change across the organization through data-driven insights and strategies.
• Strong communication and presentation skills, attention to detail ensuring data accuracy and reliability, effective collaboration with stakeholders to achieve efficient outcomes, the ability to prioritize tasks and manage time efficiently, a commitment to continuous learning, and the ability to challenge the status quo and demonstrate thought leadership.
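
As a hedged illustration of the PySpark and file-format handling called out above, the following minimal sketch reads raw JSON telemetry and writes partitioned Parquet for downstream Athena/Glue querying. The S3 paths and column names are hypothetical, and a real pipeline would add schema enforcement, quality checks, and encryption controls.

# Minimal, illustrative PySpark job; bucket paths and columns are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("telemetry-normalise").getOrCreate()

# Read raw JSON telemetry landed by the ingestion layer (path is hypothetical).
raw = spark.read.json("s3://example-raw-telemetry/landing/")

# Light normalisation: derive a partition column and drop obvious duplicates.
clean = (
    raw.withColumn("event_date", F.to_date(F.col("event_time")))
       .dropDuplicates(["event_id"])
)

# Write columnar, partitioned Parquet for downstream consumers.
(
    clean.write.mode("overwrite")
         .partitionBy("event_date")
         .parquet("s3://example-curated-telemetry/events/")
)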

Education Qualification:



Bachelor's or master's degree in Engineering (Computer Science/Information Technology).

If you're already part of the Commonwealth Bank Group (including Bankwest, x15ventures), you'll need to apply through Sidekick to submit a valid application. We're keen to support you with the next step in your career.


We're aware of some accessibility issues on this site, particularly for screen reader users. We want to make finding your dream job as easy as possible, so if you require additional support please contact HR Direct on 1800 989 696.


Advertising End Date: 29/06/2025



Job Detail

  • Job Id
    JD3790279
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type:
    Full Time
  • Salary:
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    KA, IN, India
  • Education
    Not mentioned
  • Experience
    15-18 years