Data Modeler with SQL

KA, IN, India

Job Description

Project description



Support one of Australia's leading banks in modernizing its customer entitlements platform and Identity and Access Management (IAM). The current landscape spans multiple data sources, databases, and hundreds of Oracle stored procedures fronted by web services; the goal is to re-architect these into a single, unified platform. This is a hands-on engineering role (not a business analyst role), requiring data modeling, manifest-driven mapping, and delivery of working solutions. Candidates are expected to be in the office 50% of the time (Sydney-based hybrid model).


Responsibilities




Map the current state to the target state using a manifest-driven approach; propose and implement schema changes (see the sketch after this list).


Design and evolve models across relational, graph, and NoSQL databases.


Refactor Oracle PL/SQL logic and deliver migrations, pipelines, and automated tests.


Build and optimise solutions on AWS (RDS, S3, Lambda, ECS, EKS, IAM, and Neptune where applicable).


Automate validations and CI/CD steps using Python and Groovy.


Partner with IAM, API, and platform teams to ensure compliance, auditability, and secure entitlements.
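
The manifest-driven mapping mentioned above can be illustrated with a minimal sketch. The JSON manifest structure, column names, and helper functions below are hypothetical examples rather than the bank's actual tooling; they simply show how a versioned manifest can drive both the target DDL and the extraction query:

```python
import json

# Hypothetical manifest: declares how legacy source columns map onto the target model.
MANIFEST = json.loads("""
{
  "target_table": "customer_entitlement",
  "mappings": [
    {"source": "CUST_ENTL.CUST_ID",   "target": "customer_id", "type": "BIGINT"},
    {"source": "CUST_ENTL.ENTL_CODE", "target": "entitlement", "type": "VARCHAR(64)"},
    {"source": "CUST_ENTL.UPD_TS",    "target": "updated_at",  "type": "TIMESTAMP"}
  ]
}
""")

def build_ddl(manifest: dict) -> str:
    """Generate the target-table DDL from the manifest instead of hand-writing it."""
    cols = ",\n  ".join(f'{m["target"]} {m["type"]}' for m in manifest["mappings"])
    return f'CREATE TABLE {manifest["target_table"]} (\n  {cols}\n);'

def build_select(manifest: dict) -> str:
    """Generate the extraction query that feeds the migration pipeline."""
    cols = ",\n  ".join(f'{m["source"]} AS {m["target"]}' for m in manifest["mappings"])
    source_table = manifest["mappings"][0]["source"].split(".")[0]
    return f'SELECT\n  {cols}\nFROM {source_table};'

if __name__ == "__main__":
    print(build_ddl(MANIFEST))
    print(build_select(MANIFEST))
```

Because the mapping lives in a versioned manifest rather than in hundreds of stored procedures, proposed schema changes can be reviewed, replayed, and audited as part of CI/CD.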


Skills




Must have


Data modelling, SQL, and PL/SQL


AWS data services and automation


Python and Groovy scripting


Graph DB (Neptune or Neo4j) and NoSQL databases


Ping manifest/schema and policy governance


IAM principles and audit compliance


Strong data modelling across conceptual, logical, and physical layers with clear artefacts (ER models, graph schemas, JSON/Avro/DDL)


Hands-on experience with SQL and PL/SQL for reading/refactoring stored procedures and writing performant queries


Practical experience with AWS services, including RDS, S3, Lambda, IAM/Secrets Manager, CloudWatch/CloudTrail, and Amazon Neptune.


Proficiency in Python for data engineering, validation, profiling, and reconciliation tasks (see the sketch after this list); Groovy for scripting and pipeline integration (e.g., GitHub Actions)


Experience with manifest-driven mapping or schema registry patterns; ability to govern schema changes and versioning


Understanding of IAM models (RBAC/ABAC), PII handling, and audit/lineage requirements in regulated environments
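
As a rough sketch of the validation/reconciliation work referenced above (the table names, key columns, and in-memory SQLite connections are illustrative stand-ins for the real Oracle/RDS sources, not part of the actual platform):

```python
import sqlite3

def reconcile(src_conn, tgt_conn, src_table, tgt_table, src_key, tgt_key):
    """Compare row counts and key coverage between a source table and its migrated target."""
    src_count = src_conn.execute(f"SELECT COUNT(*) FROM {src_table}").fetchone()[0]
    tgt_count = tgt_conn.execute(f"SELECT COUNT(*) FROM {tgt_table}").fetchone()[0]
    src_keys = {row[0] for row in src_conn.execute(f"SELECT {src_key} FROM {src_table}")}
    tgt_keys = {row[0] for row in tgt_conn.execute(f"SELECT {tgt_key} FROM {tgt_table}")}
    return {
        "source_rows": src_count,
        "target_rows": tgt_count,
        "missing_in_target": sorted(src_keys - tgt_keys),
        "unexpected_in_target": sorted(tgt_keys - src_keys),
    }

if __name__ == "__main__":
    # In-memory databases standing in for the legacy source and the new platform.
    src, tgt = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
    src.execute("CREATE TABLE cust_entl (cust_id INTEGER)")
    tgt.execute("CREATE TABLE customer_entitlement (customer_id INTEGER)")
    src.executemany("INSERT INTO cust_entl VALUES (?)", [(1,), (2,), (3,)])
    tgt.executemany("INSERT INTO customer_entitlement VALUES (?)", [(1,), (2,)])
    print(reconcile(src, tgt, "cust_entl", "customer_entitlement", "cust_id", "customer_id"))
```

A report like this (row counts plus missing and unexpected keys) is the kind of output that automated tests and CI/CD checks would typically assert on during the migration.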


Nice to have


Proficient with Ping products (IDM, Authorize, Directory, Federate)


Experience with Cypher (Neo4j), Gremlin (TinkerPop), or SPARQL for graph traversal and entitlements modeling


Exposure to Terraform or AWS CDK for infrastructure as code in data platforms


Performance tuning experience with Redshift or Aurora


Familiarity with metadata/lineage tooling and data contract frameworks


Banking domain experience, especially in entitlements and authorization


Knowledge of API/service integration patterns and schema registries


Other




Languages


English: C1 Advanced


Seniority


Senior



Bengaluru, India


Req. VR-117328


Data Modeling


BCM Industry


11/09/2025





Job Detail

  • Job Id: JD4225706
  • Industry: Not mentioned
  • Total Positions: 1
  • Job Type: Full Time
  • Salary: Not mentioned
  • Employment Status: Permanent
  • Job Location: KA, IN, India
  • Education: Not mentioned
  • Experience: Year