About The Company:
ARA's client is a leading IT solutions provider, offering Applications, Business Process Outsourcing (BPO) and Infrastructure services globally through a combination of technology know-how and domain and process expertise. Over the years, they have left an indelible impression in the IT solutions domain, with an impressive clientele and an extensive global presence. Their integrated solutions offering is aimed at creating value for customers, helping them improve their business processes with minimal hassle and capital outlay.
The Role:
We are looking for an experienced Databricks Developer to join our team. The ideal candidate will have expertise in Python, PySpark, SQL, and cloud platforms such as AWS. The role involves designing, developing, and optimizing data pipelines on Databricks, ensuring efficient data processing, transformation, and analytics.
Key Responsibilities:
Design and develop ETL/data pipelines using Databricks and Apache Spark.
Optimize and manage Spark-based workloads for scalability and performance.
Work with structured and unstructured data from multiple sources.
Implement Delta Lake for data reliability and ACID transactions, and Unity Catalog for centralized governance and cataloging in Databricks.
Develop and maintain SQL-based transformations, queries, and performance tuning.
Collaborate with data engineers, analysts, and business teams to meet data requirements.
Implement job orchestration using Airflow, Databricks Workflows, or other scheduling tools.
Ensure data security, governance, and compliance best practices.
Monitor, debug, and resolve performance bottlenecks in Databricks jobs.
Work with cloud storage solutions (AWS S3).
Skills Required:
Strong experience in Databricks and AWS.
Hands-on experience with Python, PySpark, Scala, SQL.
Experience with Spark performance tuning and optimization.
Knowledge of Delta Lake, Lakehouse Architecture, and Medallion Architecture.
Familiarity with orchestration tools (Airflow, Databricks Workflows).
Hands-on experience with data modeling and transformation techniques.
Experience with the AWS cloud platform.
Proficiency in CI/CD for Databricks using Git and DevOps tools.
Strong understanding of data security, governance, and access control in Databricks.
Good knowledge of APIs, REST services, and integrating Databricks with external systems.
Qualifications & Experience:
Bachelor's or Master's degree in Computer Science, Information Systems, Engineering or equivalent.
4 to 6 years of experience.
Preferred Qualifications:
Databricks Certification (Databricks Certified Associate/Professional).
Knowledge of BI tools like Power BI, Looker, ThoughtSpot.
Required Skills: Databricks, AWS, Python, PySpark, SQL, ETL/data pipelines, Delta Lake, Databricks Workflows
Educational Qualification: Bachelor's or Master's degree in Computer Science, Information Systems, Engineering or equivalent.
Apr 03, 2025
Pay Rate: 1,600,000 - 1,700,000 INR per annum
4 - 8 Years Exp.
1 Vacancy