Job Description

Job Summary




We are seeking an AWS Databricks Developer with 6 to 10 years of experience to join our team. The ideal candidate will have expertise in Spark in Scala, Delta Sharing, and Databricks Unity Catalog. This role involves working with cutting-edge technologies such as the Databricks CLI, Delta Live Pipelines, and Structured Streaming. The candidate will play a crucial role in managing risk and ensuring data integrity using tools such as Apache Airflow, Amazon S3, and Python. The position is hybrid with no travel required.



Responsibilities



  • Develop and maintain scalable data pipelines using Spark in Scala to ensure efficient data processing and analysis.
  • Implement Delta Sharing and Databricks Unity Catalog to manage and secure data access across the organization.
  • Utilize the Databricks CLI and Delta Live Pipelines to automate data workflows and improve operational efficiency.
  • Design and execute Structured Streaming processes to handle real-time data ingestion and processing.
  • Apply risk management strategies to identify and mitigate potential data-related risks.
  • Integrate Apache Airflow to orchestrate complex data workflows and ensure seamless data operations.
  • Leverage Amazon S3 for data storage solutions, ensuring high availability and durability of data assets.
  • Utilize Python for scripting and automation tasks to enhance productivity and streamline processes.
  • Develop and optimize Databricks SQL queries to extract meaningful insights from large datasets.
  • Implement Databricks Delta Lake to ensure data reliability and consistency across various data sources.
  • Manage Databricks Workflows to automate and schedule data tasks, improving overall data management efficiency.
  • Collaborate with cross-functional teams to ensure alignment on data strategies and objectives.
  • Contribute to the continuous improvement of data practices and methodologies to support the company's mission.

Qualifications



  • Strong expertise in Spark in Scala and Databricks technologies.
  • Proficiency in Delta Sharing and Databricks Unity Catalog.
  • Experience with the Databricks CLI and Delta Live Pipelines.
  • Capability in Structured Streaming and risk management.
  • Skill in Apache Airflow and Amazon S3.
  • Strong command of Python for data-related tasks.
  • Familiarity with Databricks SQL and Delta Lake.
  • Understanding of Databricks Workflows and their applications.
  • Problem-solving skills and attention to detail.
  • Ability to work in a hybrid model with a focus on day shifts.
  • Excellent communication and collaboration skills.
  • Commitment to continuous learning and professional development.
  • Adaptability to changing technologies and business needs.

Certifications Required




Databricks Certified Associate Developer for Apache Spark

Beware of fraud agents! Do not pay money to get a job.

MNCJobsIndia.com will not be responsible for any payment made to a third party. All Terms of Use are applicable.


Job Detail

  • Job Id
    JD4059257
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type
    Full Time
  • Salary
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    Karnataka, India
  • Education
    Not mentioned
  • Experience
    Year