Responsibilities:
Design, build, and maintain scalable ETL/ELT pipelines using Python, PySpark, and Databricks to ingest, transform, and load data from various sources into our data warehouse.
Develop and optimize complex SQL queries for data extraction, transformation, and analysis within Snowflake.
Implement and manage real-time data streaming solutions using Kafka to support immediate data processing needs.
Collaborate with data scientists, analysts, and other engineering teams to understand data requirements and deliver solutions that meet business needs.
Ensure data quality, integrity, and security across all data platforms and pipelines.
Monitor data pipeline performance, troubleshoot issues, and implement solutions for optimization and error resolution.
Develop and maintain documentation for data models, data flows, and ETL processes.
Participate in code reviews, contribute to architectural discussions, and promote best practices in data engineering.
Stay current with emerging data technologies and trends, and recommend their adoption where appropriate.
Required Qualifications:
Bachelor's degree in Computer Science, Engineering, Information Systems, or a related quantitative field.
Minimum of 5 years of professional experience in data engineering or a similar role.
Strong proficiency in Python for data manipulation, scripting, and automation.
Extensive experience with SQL for complex data querying, analysis, and database management.
Demonstrated experience with PySpark for big data processing and distributed computing.
Hands-on experience with Snowflake as a cloud data warehouse, including schema design, performance tuning, and data loading.
Proven experience with Databricks for data engineering workflows, notebook development, and cluster management.
Experience with Kafka for building real-time data streaming applications.
Solid understanding of data warehousing concepts, data modeling (dimensional and normalized), and ETL/ELT principles.
Excellent problem-solving skills and attention to detail.
Strong communication and collaboration skills, with the ability to work effectively in a team environment.
Send your updated CV to careers@giantmindsolutions.com.
Job Type: Full-time
Pay: ₹800,000.00 - ₹2,000,000.00 per year
Benefits:
Work from home
Schedule:
Monday to Friday
Experience:
Total work: 5 years (Required)
Python, PySpark, SQL, Snowflake, Databricks, Kafka: 5 years (Required)
Work Location: Remote
Beware of fraud agents! Do not pay money to get a job.
MNCJobsIndia.com will not be responsible for any payment made to a third party. All Terms of Use are applicable.
Job Detail
Job Id: JD3747507
Industry: Not mentioned
Total Positions: 1
Job Type: Contract
Salary: Not mentioned
Employment Status: Permanent
Job Location: Remote, IN, India
Education: Not mentioned
Experience: Year