Software Engineer III (Python, PySpark, Databricks, AWS)

MH, IN, India

Job Description




We have an exciting and rewarding opportunity for you to take your software engineering career to the next level.


As a Databricks Developer at JPMorgan Chase within the Reporting Technology team, you will play a pivotal role in our data transformation journey. You will be responsible for designing, developing, and implementing cloud-based solutions to replace existing vendor software, supporting the daily needs of Finance users across different APAC locations.

Job responsibilities




  • Executes software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
  • Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems
  • Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development
  • Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems
  • Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture
  • Contributes to software engineering communities of practice and events that explore new and emerging technologies
  • Adds to team culture of diversity, opportunity, inclusion, and respect



Required qualifications, capabilities, and skills




  • Formal training or certification on software engineering concepts and 3+ years of applied experience
  • Hands-on experience in programming languages such as Python and Big Data technologies such as Spark and Kafka
  • Strong expertise in Databricks and AWS Cloud
  • Proven experience in SQL database management and development
  • Proficiency in software development processes, including the DBX framework and Agile methodologies
  • Experience with data integration, ETL processes, and data warehousing
  • Excellent problem-solving skills and attention to detail
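For context, the snippet below is a minimal, illustrative sketch of the kind of PySpark work on Databricks described above: reading raw data landed on AWS S3, applying simple transformations, and writing a Delta table for downstream Finance reporting. It is not taken from this posting; the bucket path, table name, and column names are hypothetical.

    # Illustrative sketch only; paths, tables, and columns are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("finance-reporting-etl").getOrCreate()

    # Read raw trade data landed in S3 (hypothetical location and schema).
    raw = spark.read.parquet("s3://example-bucket/raw/trades/")

    # Basic cleansing and aggregation for a daily reporting view.
    daily_summary = (
        raw.filter(F.col("status") == "SETTLED")
           .withColumn("trade_date", F.to_date("trade_timestamp"))
           .groupBy("trade_date", "book")
           .agg(F.sum("notional").alias("total_notional"),
                F.count("*").alias("trade_count"))
    )

    # Persist the result as a Delta table for downstream reporting (hypothetical name).
    daily_summary.write.format("delta").mode("overwrite").saveAsTable("finance.daily_trade_summary")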



Preferred qualifications, capabilities, and skills




  • AWS, Google, or Azure certification
  • Databricks Certified Data Engineer or equivalent
  • Knowledge of DevOps practices and CI/CD pipelines, and containerization (Docker, Kubernetes, etc.)





Job Detail

  • Job Id
    JD4542574
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type
    Full Time
  • Salary
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    MH, IN, India
  • Education
    Not mentioned
  • Experience
    Year