Snowflake Developer with Oracle GoldenGate / Data Engineer

MH, IN, India

Job Description

Job Title: Snowflake Developer with Oracle GoldenGate / Data Engineer


About Oracle FSGIU - Finergy:


The Finergy division within Oracle FSGIU is dedicated to the Banking, Financial Services, and Insurance (BFSI) sector. We offer deep industry knowledge and expertise to address the complex financial needs of our clients. With proven methodologies that accelerate deployment and personalization tools that create loyal customers, Finergy has established itself as a leading provider of end-to-end banking solutions. Our single platform for a wide range of banking services enhances operational efficiency, and our expert consulting services ensure technology aligns with our clients' business goals.


Responsibilities:


Snowflake Data Modeling & Architecture:

Design and implement scalable Snowflake data models using best practices such as Snowflake Data Vault methodology.
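
For orientation only, a minimal sketch of what a Data Vault hub and satellite might look like in Snowflake is shown below, created through the snowflake-connector-python driver. Every object name, column, and connection parameter here is an assumed placeholder, not part of this role's actual model.

    # Illustrative sketch only: one Data Vault hub and one satellite, created via
    # snowflake-connector-python. All names and credentials are placeholders.
    import snowflake.connector

    DDL = [
        # Hub: one row per business key, identified by a hash key.
        """
        CREATE TABLE IF NOT EXISTS hub_customer (
            customer_hk  BINARY(20)    NOT NULL,   -- hash of the business key
            customer_id  VARCHAR       NOT NULL,   -- business key
            load_ts      TIMESTAMP_NTZ NOT NULL,
            record_src   VARCHAR       NOT NULL,
            CONSTRAINT pk_hub_customer PRIMARY KEY (customer_hk)
        )
        """,
        # Satellite: descriptive attributes, versioned by load timestamp.
        """
        CREATE TABLE IF NOT EXISTS sat_customer_details (
            customer_hk  BINARY(20)    NOT NULL,
            load_ts      TIMESTAMP_NTZ NOT NULL,
            hashdiff     BINARY(20)    NOT NULL,   -- change-detection hash
            full_name    VARCHAR,
            email        VARCHAR,
            record_src   VARCHAR       NOT NULL,
            CONSTRAINT pk_sat_customer_details PRIMARY KEY (customer_hk, load_ts)
        )
        """,
    ]

    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",   # placeholders
        warehouse="ETL_WH", database="RAW_VAULT", schema="DV",
    )
    try:
        cur = conn.cursor()
        for stmt in DDL:
            cur.execute(stmt)
    finally:
        conn.close()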

Real-Time Data Replication & Ingestion:

Utilize Oracle GoldenGate for Big Data to manage real-time data streaming and optimize Snowpipe for automated data ingestion.
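
As a rough sketch of that flow, the snippet below assumes a GoldenGate for Big Data handler is landing change records as JSON files in S3 and wires up a Snowpipe with AUTO_INGEST to copy them into a landing table. The bucket, storage integration, table, and connection details are all assumed placeholders.

    # Illustrative sketch only: Snowpipe auto-ingestion of files written to S3
    # (e.g. by an Oracle GoldenGate for Big Data handler). Names are placeholders.
    import snowflake.connector

    SQL = [
        # External stage pointing at the S3 prefix the replication handler writes to.
        """
        CREATE STAGE IF NOT EXISTS ogg_stage
            URL = 's3://example-ogg-bucket/orders/'
            STORAGE_INTEGRATION = s3_int
            FILE_FORMAT = (TYPE = 'JSON')
        """,
        # Pipe that loads each new file as S3 event notifications arrive.
        """
        CREATE PIPE IF NOT EXISTS ogg_orders_pipe AUTO_INGEST = TRUE AS
            COPY INTO orders_landing
            FROM @ogg_stage
            FILE_FORMAT = (TYPE = 'JSON')
        """,
    ]

    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",   # placeholders
        warehouse="ETL_WH", database="RAW", schema="LANDING",
    )
    try:
        cur = conn.cursor()
        for stmt in SQL:
            cur.execute(stmt)
    finally:
        conn.close()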

Cloud Integration & Management:

Work with AWS services (S3, EC2, Lambda) to integrate and manage Snowflake-based solutions.

Data Sharing & Security:

Implement Snowflake Secure Data Sharing (shares) and enforce security measures such as role-based access control (RBAC), data masking, and encryption.
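
By way of example, a minimal sketch of the kind of RBAC grants and column masking this can involve is shown below; the role, schema, table, and policy names are assumed placeholders.

    # Illustrative sketch only: a read-only role plus an email masking policy.
    # All object names and credentials are placeholders.
    import snowflake.connector

    SQL = [
        "CREATE ROLE IF NOT EXISTS analyst_role",
        "GRANT USAGE ON DATABASE analytics TO ROLE analyst_role",
        "GRANT USAGE ON SCHEMA analytics.curated TO ROLE analyst_role",
        "GRANT SELECT ON ALL TABLES IN SCHEMA analytics.curated TO ROLE analyst_role",
        # Mask email addresses for every role except an approved PII role.
        """
        CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING) RETURNS STRING ->
            CASE WHEN CURRENT_ROLE() IN ('PII_READER') THEN val ELSE '***MASKED***' END
        """,
        "ALTER TABLE analytics.curated.customers MODIFY COLUMN email SET MASKING POLICY email_mask",
    ]

    conn = snowflake.connector.connect(
        account="my_account", user="security_admin", password="***",   # placeholders
        warehouse="ADMIN_WH", database="ANALYTICS", schema="CURATED",
    )
    try:
        cur = conn.cursor()
        for stmt in SQL:
            cur.execute(stmt)
    finally:
        conn.close()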

CI/CD Implementation:

Develop and manage CI/CD pipelines for Snowflake deployment and data transformation processes.
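
For illustration, one common shape for this is a small migration runner that a Jenkins or GitLab CI job invokes to apply versioned SQL scripts in order. The sketch below assumes a migrations/ folder and CI-injected environment variables, all of which are placeholders.

    # Illustrative sketch only: apply versioned SQL migrations to Snowflake from a
    # CI job. Directory layout and environment-variable names are assumptions.
    import os
    from pathlib import Path
    import snowflake.connector

    MIGRATIONS_DIR = Path("migrations")   # e.g. V001__create_hubs.sql, V002__...

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],     # injected by the CI system
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="CI_WH",
        database="ANALYTICS",
    )
    try:
        cur = conn.cursor()
        # Apply scripts in lexical (version) order so every environment converges
        # on the same schema state. The ';' split is deliberately naive for brevity.
        for script in sorted(MIGRATIONS_DIR.glob("V*.sql")):
            print(f"applying {script.name}")
            for stmt in script.read_text().split(";"):
                if stmt.strip():
                    cur.execute(stmt)
    finally:
        conn.close()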

Collaboration & Troubleshooting:

Partner with cross-functional teams to address data-related challenges and optimize performance.

Documentation & Best Practices:

Maintain detailed documentation for data architecture, ETL processes, and Snowflake configurations.

Performance Optimization:

Continuously monitor and enhance the efficiency of Snowflake queries and data pipelines.
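
As one possible starting point, the sketch below pulls last week's slowest statements from SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY as tuning candidates; the connection details are placeholders, and ACCOUNT_USAGE access assumes a suitably privileged role.

    # Illustrative sketch only: list last week's slowest queries as tuning candidates.
    # Connection parameters are placeholders.
    import snowflake.connector

    SLOW_QUERIES_SQL = """
        SELECT query_id,
               warehouse_name,
               total_elapsed_time / 1000 AS elapsed_s,   -- column is in milliseconds
               query_text
        FROM snowflake.account_usage.query_history
        WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
        ORDER BY total_elapsed_time DESC
        LIMIT 20
    """

    conn = snowflake.connector.connect(
        account="my_account", user="monitor_user", password="***",   # placeholders
        warehouse="MONITOR_WH",
    )
    try:
        for qid, wh, elapsed_s, text in conn.cursor().execute(SLOW_QUERIES_SQL):
            print(f"{qid} | {wh} | {elapsed_s:.1f}s | {(text or '')[:80]}")
    finally:
        conn.close()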

Mandatory Skills:


  • 4 years of experience as a Data Engineer.
  • Strong expertise in Snowflake architecture, data modeling, and query optimization.
  • Proficiency in SQL for writing and optimizing complex queries.
  • Hands-on experience with Oracle GoldenGate for Big Data for real-time data replication.
  • Knowledge of Snowpipe for automated data ingestion.
  • Familiarity with AWS cloud services (S3, EC2, Lambda, IAM) and their integration with Snowflake.
  • Experience with CI/CD tools (e.g., Jenkins, GitLab) for automating workflows.
  • Working knowledge of the Snowflake Data Vault methodology.

Good to Have Skills:


  • Exposure to Databricks for data processing and analytics.
  • Knowledge of Python or Scala for data engineering tasks.
  • Familiarity with Terraform or CloudFormation for infrastructure as code (IaC).
  • Experience in data governance and compliance best practices.
  • Understanding of ML and AI integration with data pipelines.

Self-Test Questions:


  • Do I have hands-on experience in designing and optimizing Snowflake data models?
  • Can I confidently set up and manage real-time data replication using Oracle GoldenGate?
  • Have I worked with Snowpipe to automate data ingestion processes?
  • Am I proficient in SQL and capable of writing optimized queries in Snowflake?
  • Do I have experience integrating Snowflake with AWS cloud services?
  • Have I implemented CI/CD pipelines for Snowflake development?
  • Can I troubleshoot performance issues in Snowflake and optimize queries effectively?
  • Have I documented data engineering processes and best practices for team collaboration?



Job Detail

  • Job Id: JD4018421
  • Industry: Not mentioned
  • Total Positions: 1
  • Job Type: Full Time
  • Salary: Not mentioned
  • Employment Status: Permanent
  • Job Location: MH, IN, India
  • Education: Not mentioned
  • Experience: Year