Hiring for Spark, Scala, and AWS (Kolkata & Hyderabad)
Design and develop robust, scalable data pipelines using Apache Spark (Core, SQL, Streaming, MLlib, etc.) with Scala.
Write clean, modular, production-grade Scala code, adhering to coding standards and best practices.
Optimize Spark jobs for performance, reliability, and cost-efficiency, including tuning Spark configurations and job logic.
Handle data ingestion, transformation, and aggregation tasks using Spark RDDs, DataFrames, and Datasets.
Collaborate with stakeholders, including data scientists, analysts, and engineers, to understand requirements and translate them into technical solutions.
Participate in DevOps/CI/CD workflows, including deployment, monitoring, and maintenance of Spark applications.
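As a rough illustration of the Spark tuning work described above, a candidate would be expected to be comfortable adjusting settings such as the following (the keys are standard Spark configuration properties; the values are placeholders, since the right numbers depend on cluster size and workload):

```
# Illustrative spark-defaults.conf fragment -- values are examples only
spark.executor.memory            8g
spark.executor.cores             4
spark.sql.shuffle.partitions     400
spark.dynamicAllocation.enabled  true
spark.serializer                 org.apache.spark.serializer.KryoSerializer
```

These same properties can also be passed per job via `spark-submit --conf key=value`, which is often how per-pipeline tuning is applied in practice.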