We are seeking a Data Scientist who combines deep expertise in AI/ML with a strong focus on data quality and advanced analytics. The role requires a proven track record of developing production-grade machine learning solutions, implementing robust data quality frameworks, and leveraging cutting-edge analytical tools to drive business transformation through data-driven insights.
Work you will do
====================
The Data Scientist will be responsible for developing and implementing end-to-end AI/ML solutions while ensuring data quality excellence across all stages of the data lifecycle. This role requires extensive experience in modern data science platforms, AI frameworks, and analytical tools, with a focus on scalable and production-ready implementations.
Project Leadership and Management:
- Lead complex data science initiatives utilizing Databricks, Dataiku, and modern AI/ML frameworks for end-to-end solution development
- Establish and maintain data quality frameworks and metrics across all stages of model development
- Design and implement data validation pipelines and quality control mechanisms for both structured and unstructured data
Strategic Development:
- Develop and deploy advanced machine learning models, including deep learning and generative AI solutions
- Design and implement automated data quality monitoring systems and anomaly detection frameworks
- Create and maintain MLOps pipelines for model deployment, monitoring, and maintenance
Team Mentoring and Development:
- Lead and mentor a team of data scientists and analysts, fostering a culture of technical excellence and continuous learning
- Develop and implement training programs to enhance team capabilities in emerging technologies and methodologies
- Establish performance metrics and career development pathways for team members
- Drive knowledge-sharing initiatives and best practices across the organization
- Provide technical guidance and code reviews to ensure high-quality deliverables
Data Quality and Governance:
- Establish data quality standards and best practices for data collection, preprocessing, and feature engineering
- Implement data validation frameworks and quality checks throughout the ML pipeline
- Design and maintain data documentation systems and metadata management processes
- Lead initiatives for data quality improvement and standardization across projects
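For illustration only (not part of the role description), the kind of data validation check referenced above can be sketched in Python with pandas; the table and column names here are hypothetical:

```python
import pandas as pd

def validate_orders(df: pd.DataFrame) -> list[str]:
    """Run basic quality checks on an orders table; return a list of failures."""
    failures = []
    # Completeness: required columns must exist and contain no nulls
    for col in ("order_id", "amount"):
        if col not in df.columns:
            failures.append(f"missing column: {col}")
        elif df[col].isna().any():
            failures.append(f"nulls found in: {col}")
    # Uniqueness: the primary key must not repeat
    if "order_id" in df.columns and df["order_id"].duplicated().any():
        failures.append("duplicate order_id values")
    # Range: order amounts must be non-negative
    if "amount" in df.columns and (df["amount"] < 0).any():
        failures.append("negative values in: amount")
    return failures

df = pd.DataFrame({"order_id": [1, 2, 2], "amount": [10.0, -5.0, 3.0]})
print(validate_orders(df))  # flags the duplicate id and the negative amount
```

In a production pipeline, checks like these would typically run as a gating step before training or scoring, with failures routed to monitoring rather than printed.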
Technical Implementation:
- Design, develop, and deploy end-to-end AI/ML solutions using modern frameworks, including TensorFlow, PyTorch, scikit-learn, and XGBoost for machine learning; BERT and GPT for NLP; and OpenCV for computer vision applications
- Architect and implement robust data processing pipelines leveraging enterprise platforms: Databricks, Apache Spark, and Pandas for data transformation; Dataiku and Apache Airflow for ETL/ELT processes; and DVC for data version control
- Establish and maintain production-grade MLOps practices, including model deployment, monitoring, A/B testing, and continuous integration/deployment pipelines
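As an illustrative sketch only, the end-to-end flow named above (train, evaluate, persist for deployment) might look like the following with scikit-learn; the data here is synthetic:

```python
import pickle

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a real feature table
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Train and evaluate on a held-out split
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
acc = accuracy_score(y_test, model.predict(X_test))

# Persist the trained model, then reload and verify it reproduces its predictions,
# a minimal pre-deployment sanity check
blob = pickle.dumps(model)
restored = pickle.loads(blob)
assert (restored.predict(X_test) == model.predict(X_test)).all()
print(f"held-out accuracy: {acc:.2f}")
```

A production MLOps pipeline would add experiment tracking, model registry, and monitoring around this core loop; the sketch shows only the train/evaluate/persist skeleton.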
Technical Expertise Requirements:
Must Have:
- Enterprise AI/ML Platforms: Mastery of Databricks for large-scale processing, with a proven ability to architect solutions at scale
- Programming & Analysis: Advanced Python (NumPy, Pandas, scikit-learn), SQL, and PySpark, with production-level expertise
- Machine Learning: Deep expertise in TensorFlow or PyTorch, and in scikit-learn, with proven implementation experience
- Big Data Technologies: Advanced knowledge of Apache Spark, Databricks, and distributed computing architectures
- Cloud Platforms: Strong experience with at least one major cloud platform (AWS/Azure/GCP) and its ML services (SageMaker/Azure ML/Vertex AI)
- Data Processing & Analytics: Extensive experience with enterprise-grade data processing tools and ETL pipelines
- MLOps & Infrastructure: Proven experience deploying, monitoring, and maintaining production ML systems
- Data Quality: Experience implementing comprehensive data quality frameworks and validation systems
- Version Control & Collaboration: Strong proficiency with Git, JIRA, and collaborative development practices
- Database Systems: Expert-level knowledge of both SQL and NoSQL databases for large-scale data management
- Visualization Tools: Tableau, Power BI, Plotly, Seaborn
- Large Language Models: Experience with GPT, BERT, LLaMA, and fine-tuning methodologies
Good to Have:
- Additional Programming: R, Julia
- Additional Big Data: Hadoop, Hive, Apache Kafka
- Multi-Cloud: Experience across AWS, Azure, and GCP platforms
- Advanced Analytics: Dataiku, H2O.ai
- Additional MLOps: MLflow, Kubeflow, DVC (Data Version Control)
- Data Quality & Validation: Great Expectations, Deequ, Apache Griffin
- Business Intelligence: SAP HANA, SAP BusinessObjects, SAP BW
- Specialized Databases: Cassandra, MongoDB, Neo4j
- Container Orchestration: Kubernetes, Docker
- Additional Collaboration Tools: Confluence, BitBucket
Education:
Advanced degree in a quantitative discipline (Statistics, Mathematics, Computer Science, Engineering) or equivalent relevant experience.
Qualifications:
- 10-13 years of experience with data mining, statistical modeling tools, and their underlying algorithms.
- 5+ years of experience with data analysis software for large-scale analysis of structured and unstructured data.
- Proven track record of leading and delivering large-scale machine learning projects, including production model deployment and data quality framework implementation, and of using very large datasets to create data-driven insights through predictive and prescriptive analytic models.
- Extensive knowledge of supervised and unsupervised modeling techniques such as linear and logistic regression, support vector machines, decision trees and random forests, Naive Bayes, neural networks, association rules, text mining, and k-nearest neighbors, among other methods.
- Extensive experience with deep learning frameworks, automated ML platforms, data processing tools (Databricks Delta Lake, Apache Spark), analytics platforms (Tableau, Power BI), and major cloud providers (AWS, Azure, GCP).
- Experience architecting and implementing enterprise-grade solutions using cloud-native ML services while ensuring cost optimization and performance efficiency.
- Strong track record of team leadership, stakeholder management, and driving technical excellence across multiple concurrent projects.
- Expert-level proficiency in Python, R, and SQL, with a deep understanding of statistical analysis, hypothesis testing, feature engineering, model evaluation, and validation techniques in production environments.
- Demonstrated leadership experience in implementing MLOps practices, including model monitoring, A/B testing frameworks, and maintaining production ML systems at scale.
- Working knowledge of supervised and unsupervised learning techniques such as regression and generalized linear models, decision tree analysis, boosting and bagging, Principal Component Analysis, and clustering methods.
- Strong oral and written communication skills, including presentation skills.
The Team
============
Information Technology Services (ITS) helps power Deloitte's success. ITS drives Deloitte, which serves many of the world's largest, most respected organizations. We develop and deploy cutting-edge internal and go-to-market solutions that help Deloitte operate effectively and lead in the market. Our reputation is built on a tradition of delivering with excellence.
The ~3,000 professionals in ITS deliver a broad range of technology services. Among them, the Product Engineering (PxE) team is the internal software and applications development team responsible for delivering leading-edge technologies to Deloitte professionals. Its broad portfolio includes web and mobile productivity tools that empower our people to log expenses, enter timesheets, book travel, and more, anywhere, anytime. PxE enables our client service professionals through a comprehensive suite of applications across the business lines. In addition to application delivery, PxE offers full-scale design services, a robust mobile portfolio, cutting-edge analytics, and innovative custom development.
Work Location: Hyderabad
Recruiting tips
From developing a stand out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.
Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.
Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.
Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.
Professional development
From entry-level employees to senior leaders, we believe there's always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career.
Requisition code: 303069
Beware of fraud agents! Do not pay money to get a job.
MNCJobsIndia.com will not be responsible for any payment made to a third party. All Terms of Use are applicable.
Job Detail
Job Id: JD3759153
Industry: Not mentioned
Total Positions: 1
Job Type: Full Time
Salary: Not mentioned
Employment Status: Permanent
Job Location: TS, IN, India
Education: Not mentioned
Experience: Year
Apply For This Job