We are seeking an experienced ETL Developer to design, develop, and maintain efficient ETL solutions. The ideal candidate will be responsible for implementing data integration solutions, managing large datasets, and enabling data analytics by building robust data pipelines across various platforms.
Key Responsibilities:
Design, develop, and implement ETL solutions using Informatica PowerCenter, focusing on data extraction, transformation, and loading from multiple source systems into Snowflake and Teradata environments.
Develop and execute BTEQ scripts for Teradata database interactions, including data load, export, and monitoring jobs.
Work on data integration tasks to move data from various sources into the Snowflake Data Warehouse, leveraging Snowflake-specific features such as stages, file formats, and streams (see the sketch after this list).
Optimize ETL mappings, workflows, and BTEQ scripts for improved performance and reliability.
Implement data quality checks, data profiling, and data validation as part of the ETL process.
Collaborate with business analysts, data architects, and other technical teams to understand requirements and deliver scalable data solutions.
Troubleshoot and resolve ETL failures, data discrepancies, and performance bottlenecks.
Manage and monitor daily/weekly ETL jobs to ensure timely data delivery.
Maintain clear and detailed documentation of data mappings, workflows, BTEQ scripts, and technical designs.
Participate in unit testing, system integration testing, and production support activities.
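For illustration only: the Snowflake stage-and-load work referenced above typically follows a pattern like the minimal sketch below. All object names (csv_fmt, sales_stage, sales_fact) and the file path are hypothetical placeholders, not part of this role's actual environment.

    -- Define a reusable file format for CSV extracts.
    CREATE FILE FORMAT IF NOT EXISTS csv_fmt
      TYPE = CSV FIELD_DELIMITER = ',' SKIP_HEADER = 1;

    -- Create an internal stage that uses the file format by default.
    CREATE STAGE IF NOT EXISTS sales_stage
      FILE_FORMAT = (FORMAT_NAME = 'csv_fmt');

    -- From a SnowSQL client session, upload the local extract:
    --   PUT file:///tmp/daily_sales.csv @sales_stage;

    -- Load the staged file into the target table, failing fast on bad rows.
    COPY INTO sales_fact
      FROM @sales_stage/daily_sales.csv
      FILE_FORMAT = (FORMAT_NAME = 'csv_fmt')
      ON_ERROR = 'ABORT_STATEMENT';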
Key Skills Required:
Strong hands-on experience in Informatica PowerCenter development (mappings, workflows, sessions, etc.).
In-depth knowledge of Snowflake Data Warehouse architecture and functionality (tables, stages, file formats, Snowpipe, etc.).
Experience with Teradata Database, including writing efficient SQL queries, BTEQ scripting, and handling large datasets.
Proficiency in BTEQ (Basic Teradata Query) for data load, export, and automation (a minimal sketch follows this list).
Strong SQL skills (Teradata SQL, Snowflake SQL) and performance tuning capabilities.
Experience in data modeling concepts (Star Schema, Snowflake Schema, Fact and Dimension Tables).
Knowledge of UNIX/Linux scripting for automating ETL and database jobs.
Strong problem-solving skills with attention to detail.
Good communication and collaboration abilities.
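By way of example, a minimal BTEQ export script of the kind this role involves might look like the sketch below; the TDPID, credentials, database, and table names are placeholders, and real jobs would be parameterized and scheduled.

    .LOGON tdprod/etl_user,<password>;
    /* Export active customers to a flat file in record (DATA) mode. */
    .EXPORT DATA FILE = /tmp/active_customers.dat;
    SELECT customer_id, customer_name
    FROM   edw.customer
    WHERE  status = 'ACTIVE';
    .EXPORT RESET;
    /* Propagate a non-zero return code so the scheduler flags a failure. */
    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    .QUIT 0;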
Educational Qualification:
Bachelor's degree in Computer Science, Information Technology, or related discipline.
Preferred (Optional):
Experience in Cloud Platforms (AWS, Azure)
Familiarity with version control tools (Git)
Exposure to workflow automation tools (Control-M, Autosys)
Job Type: Full-time
Pay: ₹500,000.00 - ₹1,500,000.00 per year
Benefits:
Health insurance
Leave encashment
Provident Fund
Work from home
Work Location: In person