Develop custom software solutions: design, code, and enhance components across systems and applications. Use modern frameworks and agile practices to deliver scalable, high-performing solutions tailored to specific business needs.
Must have skills :
Databricks Unified Data Analytics Platform
Good to have skills :
Apache Airflow, PySpark
Minimum 12 year(s) of experience is required
Educational Qualification :
15 years full time education
Summary :
Seeking a forward-thinking professional with an AI-first mindset to design, develop, and deploy enterprise-grade solutions using Generative and Agentic AI frameworks that drive innovation, efficiency, and business transformation. As a Software Development Engineer, you will work in a dynamic environment, analyzing, designing, coding, and testing application components for multiple clients. A typical day involves collaborating with team members on enhancements and maintenance, developing new features to meet client needs, and troubleshooting issues and optimizing application performance to ensure the solutions you deliver are efficient and effective.

Roles & Responsibilities :
- Lead AI-driven solution design and delivery by applying GenAI and Agentic AI to address complex business challenges, automate processes, and integrate intelligent insights into enterprise workflows for measurable impact.
- Act as a subject matter expert (SME).
- Collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems that apply across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Mentor junior team members to foster their professional growth.

Professional & Technical Skills :
- Strong grasp of Generative and Agentic AI, prompt engineering, and AI evaluation frameworks.
- Ability to align AI capabilities with business objectives while ensuring scalability, responsible use, and tangible value realization.
- Must Have Skills : Proficiency in Databricks Unified Data Analytics Platform.
- Good To Have Skills : Experience with Apache Airflow and PySpark.
- Strong understanding of data processing and analytics workflows.
- Experience in developing and deploying scalable data solutions.
- Proficiency in coding and debugging in relevant programming languages.

Additional Information :
- The candidate should have a minimum of 12 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.