Software Development Engineer (SDE2) - Data Integration Engineer

Pune, Maharashtra, India

Job Description

Experience: 2 to 4 years

Location: Pune

Qualification: B.E./B.Tech (Computer/IT)

Technical Know-how: ETL, Talend, DBT (Data Build Tool), Data Lakehouse, complex SQL


Responsibilities:

  • Contribute to end-to-end data integration efforts, including requirements analysis, solution design, development, and support of the technology stack.
  • Design, develop, and optimize ETL/ELT pipelines using Talend, DBT, and other modern data integration tools.
  • Participate in the full data integration lifecycle: requirements gathering, design, development, testing, deployment, and production support.
  • Collaborate with cross-functional teams to deliver scalable and reusable ETL frameworks supporting analytical and reporting products.
  • Assist in integrating event-driven streaming technologies (Kafka; CDC tools such as GoldenGate or Qlik Replicate) for near real-time data processing.
  • Perform performance analysis and optimization of ETL processes, databases, and SQL queries.
  • Support the design and implementation of data modeling solutions following Kimball methodologies (Star Schema, Snowflake Schema, SCD Type 1 & 2); see the sketch after this list.
  • Prepare and maintain data mappings, transformation logic, and documentation for ETL processes.
  • Learn and adopt coding best practices, with opportunities to mentor interns and freshers.
  • Work closely with clients and stakeholders to align deliverables with business objectives.
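For illustration only (not part of the posting): a minimal sketch of the SCD Type 2 pattern referenced above, using a hypothetical customer dimension with customer_id/city columns and in-memory dicts in place of a real table. In practice this logic would live in a Talend job, a DBT snapshot, or a SQL MERGE rather than application code.

    from datetime import date

    def apply_scd2(dimension, incoming, today=None):
        """Apply SCD Type 2 versioning for one incoming source record:
        expire the current version if a tracked attribute changed, then
        append a new open-ended version. `dimension` is a list of dicts."""
        today = today or date.today()
        for row in dimension:
            if row["customer_id"] == incoming["customer_id"] and row["is_current"]:
                if row["city"] == incoming["city"]:
                    return dimension  # unchanged: keep the current version
                row["effective_to"] = today  # close out the old version
                row["is_current"] = False
        dimension.append({
            **incoming,
            "effective_from": today,
            "effective_to": None,  # open-ended current version
            "is_current": True,
        })
        return dimension

    dim = [{"customer_id": 1, "city": "Pune",
            "effective_from": date(2023, 1, 1), "effective_to": None,
            "is_current": True}]
    apply_scd2(dim, {"customer_id": 1, "city": "Mumbai"}, today=date(2024, 6, 1))
    print(dim)  # Pune row expired; Mumbai row is now the current version

An SCD Type 1 change, by contrast, would simply overwrite the city in place with no history kept.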

Requirements & Experience

  • Bachelor's degree in Computer Science, Information Technology, or a related discipline.
  • 2-4 years of experience in data warehouse development and ETL/ELT design and development.
  • Hands-on ETL coding experience on at least 1-2 projects as a team member.
  • Practical experience with ETL tools such as Talend, DBT, or similar; exposure to CDC tools (GoldenGate, Qlik Replicate, etc.) is desirable.
  • Familiarity with cloud environments (OCI, AWS, Azure) is a plus.
  • Experience working with data file formats such as JSON, CSV, and Parquet (see the sketch after this list); exposure to the Iceberg table format is an advantage.
  • Good understanding of database concepts; experience with Oracle, SQL Server, or Postgres preferred.
  • Exposure to Kimball modeling techniques (Star Schema, Snowflake Schema, SCD Type 1 & 2).
  • Strong skills in writing optimized SQL/PL/SQL programs for business scenarios.
  • Excellent verbal and written communication skills.
  • Ability to collaborate effectively with global teams.
  • Exposure to cloud data platforms such as Snowflake or Redshift is desirable.
  • Knowledge of reporting tools (e.g., Power BI) and financial-domain concepts is an added advantage.
  • Understanding of project management methodologies is a plus.
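Likewise illustrative: a minimal sketch of landing the listed file formats into one tabular structure, assuming pandas with a Parquet engine such as pyarrow is installed. The orders.* file names are hypothetical.

    import pandas as pd  # assumes pandas plus a Parquet engine (e.g., pyarrow)

    # Hypothetical landing files for one source; each reader returns a
    # DataFrame, so all three formats converge on the same structure.
    frames = [
        pd.read_csv("orders.csv"),
        pd.read_json("orders.json", lines=True),  # newline-delimited JSON
        pd.read_parquet("orders.parquet"),
    ]
    orders = pd.concat(frames, ignore_index=True)
    print(orders.dtypes, len(orders))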



Job Detail

  • Job Id: JD4980905
  • Industry: Not mentioned
  • Total Positions: 1
  • Job Type: Internship
  • Salary: Not mentioned
  • Employment Status: Permanent
  • Job Location: Pune, Maharashtra, India
  • Education: Not mentioned
  • Experience: 2 to 4 years