We are looking for a Data Engineer to design, build, and maintain scalable data pipelines and data platforms. The ideal candidate has strong technical skills in data processing, databases, and cloud technologies, and is eager to work with large datasets to support analytics and business intelligence needs.
Key Responsibilities:
Design, develop, and maintain ETL/ELT pipelines for structured and unstructured data
Build and optimize data workflows for data ingestion, transformation, and storage
Work with large-scale datasets from multiple sources
Ensure data quality, integrity, and reliability
Collaborate with data analysts, data scientists, and product teams
Optimize database performance and query efficiency
Support data warehouse and data lake solutions
Monitor and troubleshoot data pipeline issues
Document data architecture, pipelines, and processes
Required Skills & Qualifications:
Bachelor's degree in Computer Science, Engineering, IT, or a related field
1-3 years of experience in Data Engineering or related roles
Strong knowledge of SQL and relational databases
Experience with Python, Java, or Scala for data processing
Familiarity with ETL tools and data integration frameworks
Experience with data warehouses (Snowflake, Redshift, BigQuery, etc.)
Basic understanding of cloud platforms (AWS, Azure, or GCP)
Knowledge of data modeling concepts (star/snowflake schema)
Preferred Skills (Good to Have):
Experience with Apache Spark, Hadoop, Kafka, or Airflow
Exposure to Big Data technologies
Experience with NoSQL databases (MongoDB, Cassandra, DynamoDB)
Familiarity with Docker and Kubernetes
Understanding of data security, governance, and compliance
Cloud data certifications or relevant training
Job Type: Full-time
Pay: ₹200,000.00 - ₹300,000.00 per year
Work Location: In person
Beware of fraud agents! Do not pay money to get a job.
MNCJobsIndia.com will not be responsible for any payment made to a third party. All Terms of Use are applicable.