The Senior Data Engineer will be responsible for building and
deploying scalable, data-driven solutions. Data Engineers work
with both SQL and NoSQL technologies and collaborate with other product
development teams to identify and implement the appropriate data repositories
based on strategic business needs.
Responsibilities for Senior Data Engineer:
- Data Pipeline Development:
  - Assist in designing, developing, and maintaining data pipelines to ensure efficient data ingestion, transformation, and storage.
  - Collaborate with senior data engineers to implement ETL (Extract, Transform, Load) processes and workflows.
- Database Management:
  - Support the management and optimization of relational and NoSQL databases.
  - Assist in database schema design, indexing, and performance tuning.
- Data Quality and Integrity:
  - Monitor and validate data to ensure accuracy and consistency across systems.
  - Help identify and resolve data quality issues and implement data cleansing techniques.
- Data Integration:
  - Assist in integrating data from various sources, including internal and external systems, APIs, and third-party data providers.
  - Support data migration and synchronization tasks as needed.
- Collaboration and Communication:
  - Work closely with data analysts, data scientists, and other stakeholders to understand data requirements and deliver solutions.
  - Participate in team meetings, providing updates on progress and contributing to project discussions.
- Documentation and Reporting:
  - Document data engineering processes, workflows, and systems to ensure clarity and maintainability.
  - Prepare reports and presentations to communicate data insights and project status to the team and management.
Qualifications and Working Experience:
1. Bachelor's degree in Computer Science, Information Technology, Data Science, or
a related field.
2. 5 - 7 years of experience in data engineering, database management, or a
related role.
Technical Skills:
1. Proficiency in SQL and experience with relational databases (e.g., MySQL,
PostgreSQL, SQL Server).
2. Familiarity with NoSQL databases (e.g., MongoDB, Cassandra) is a plus.
3. Knowledge of programming languages such as Python, Java, or Scala is a must.
4. Understanding of data warehousing concepts and ETL processes.
5. Familiarity with data integration tools and platforms (e.g., Apache NiFi,
Talend) is a plus.
6. Exposure to or experience with AWS Glue and/or Azure Data Factory is a plus.
7. Experience with APIs and big data technologies is desirable.
8. Exposure to ML basics is desirable.
Required Qualities:
1. Strong analytical and problem-solving skills.
2. Ability to work effectively both independently and as part of a team.
3. Excellent communication and interpersonal skills.
4. Attention to detail and a commitment to delivering high-quality work.
Job Types: Full-time, Permanent
Pay: ₹2,500,000.00 - ₹3,500,000.00 per year
Application Question(s):
How many years of overall experience do you currently have?
What is your notice period in days?
What is your CTC in LPA?
Are you from a Tier I college?
Experience:
ETL (Extract, Transform, Load): 2 years (Required)
Data Engineering: 5 years (Required)
SQL: 3 years (Required)
Python: 3 years (Required)
Data Modeling: 3 years (Preferred)
Azure: 3 years (Preferred)
AWS: 3 years (Preferred)
Scala: 2 years (Preferred)
Product development: 1 year (Required)
Healthcare domain: 1 year (Preferred)
Work Location: In person