(ETL, SQL, Java/Scala/PySpark, GCP)
Key Responsibilities:
ETL Development: Design, develop, and maintain robust ETL pipelines to process large volumes of data efficiently.
Database Management: Utilize SQL to manage and query relational databases, ensuring data integrity and performance.
Big Data Processing: Leverage Java, Scala, or PySpark to process and analyze large datasets in distributed environments.
Cloud Infrastructure: Implement and manage data solutions on Google Cloud Platform (GCP), utilizing services like BigQuery, Dataflow, and Pub/Sub.
Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver scalable solutions.
Mandatory Skills:
ETL Tools: Experience with ETL frameworks and tools.
SQL: Strong proficiency in SQL for data manipulation and querying.
Programming Languages: Proficiency in Java, Scala, or PySpark for data processing.
Cloud Platforms: Hands-on experience with GCP services, including BigQuery, Dataflow, and Pub/Sub.
Data Warehousing: Knowledge of data warehousing concepts and architectures.
Preferred Qualifications:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Experience with additional cloud platforms (AWS, Azure) is a plus.
Familiarity with containerization technologies like Docker and Kubernetes.
Strong problem-solving skills and the ability to work in a fast-paced environment.
Job Type: Full-time
Pay: ₹559,394.54 - ₹1,339,970.88 per year
Benefits:
Health insurance
Provident Fund
Schedule:
Day shift
Fixed shift
Monday to Friday
Morning shift
Work Location: In person
Application Deadline: 28/06/2025
Expected Start Date: 01/07/2025