Databricks Engineer

Remote, IN, India

Job Description

Job Title: Databricks Engineer (Contract - Remote)

Job Type: Contract (40 hours/week, long-term engagement)

Location: Remote

About the Role:

We are looking for a highly skilled Databricks Engineer with strong expertise in Python, PySpark, SQL/PLSQL, and advanced data engineering practices. The ideal candidate will be hands-on with Spark and Databricks optimizations, have proven experience troubleshooting large-scale data pipelines, and demonstrate strong problem-solving skills.

This is a long-term remote contract role requiring excellent technical depth and strong communication skills for effective collaboration with cross-functional teams.

Responsibilities:

  • Design, develop, and maintain scalable data pipelines and ETL processes using Databricks, PySpark, and SQL/PLSQL.
  • Optimize and tune Spark jobs for performance, scalability, and cost-efficiency.
  • Troubleshoot and debug complex data processing issues, Databricks job failures, and Spark performance bottlenecks.
  • Work with large-scale structured and unstructured datasets across cloud platforms (AWS/Azure/GCP).
  • Collaborate with data scientists, analysts, and business teams to deliver high-quality, reliable, and timely data solutions.
  • Implement best practices for coding, testing, monitoring, and CI/CD in Databricks workflows.
  • Ensure compliance with data governance, security, and quality standards.
  • Proactively identify and recommend improvements to data pipeline efficiency.

Required Skills & Qualifications:

  • 5+ years of hands-on experience in Data Engineering.
  • Strong proficiency in Python, PySpark, and SQL/PLSQL.
  • In-depth knowledge of the Databricks platform and experience with job orchestration and monitoring.
  • Proven expertise in Spark optimization techniques (shuffle tuning, partitioning, caching, broadcast joins, AQE, etc.).
  • Strong understanding of data lake architectures, Delta Lake, and cloud-native data solutions.
  • Solid experience debugging and troubleshooting large-scale distributed systems.
  • Familiarity with CI/CD pipelines, Git, and DevOps practices.
  • Excellent analytical, problem-solving, and communication skills.

Preferred Qualifications:

  • Experience with cloud services such as AWS Glue, Azure Data Factory, or GCP Dataflow.
  • Knowledge of data modeling and warehousing (Snowflake/Redshift/BigQuery).
  • Experience working in Agile environments.

Engagement Details:

  • Contract role: 40 hours per week.
  • Remote work (long-term engagement).
  • Flexible collaboration with distributed teams across multiple time zones.
Job Types: Full-time, Contractual / Temporary, Freelance
Contract length: 6 months

Pay: ₹80,000.00 - ₹100,000.00 per month

Work Location: Remote



Job Detail

  • Job Id
    JD4117132
  • Total Positions
    1
  • Job Location
    Remote, IN, India