Mandatory skills: Azure SQL, Python, PySpark, Databricks, Azure, Kafka, etc. The candidate should also be well versed in Agile methodologies and CI/CD deployment models.
Mode of Interview: Virtual || 2 rounds
Location: Hyderabad (Hybrid)
CTC: 40 LPA
We are seeking a Data Architect with 10+ years of experience to support and enhance our enterprise data warehouse platforms. This role is critical in delivering high-quality master data solutions across the organization, utilizing modern tools like Databricks and Azure to drive insights and ensure data reliability. The ideal candidate will have strong SQL and data profiling skills and experience working with cross-functional teams in a semiconductor environment. The candidate must have architect-level experience with technologies such as Azure SQL, Python, PySpark, Databricks, and Kafka, and should be well versed in Agile methodologies and CI/CD deployment models.
Must-Have Skills:
Data Architecture Design:
Design and implement scalable, secure, and high-performance data architectures to support diverse business needs.
Develop and maintain data models, data dictionaries, and metadata repositories.
Define and enforce data standards, policies, and procedures.
Data Strategy & Governance:
Collaborate with stakeholders to define and maintain data strategies and roadmaps.
Establish and maintain data governance frameworks to ensure data quality and compliance. Implement reusable data quality frameworks.
Evaluate and recommend new data technologies and tools.
Change how we think about, act on, and utilize our data by performing exploratory and quantitative analytics, data mining, and discovery.
Ensure data security and privacy through appropriate access controls and encryption.
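The "reusable data quality frameworks" responsibility above can be illustrated with a minimal sketch. All names (`Rule`, `run_rules`, the example rules) are hypothetical and purely illustrative; in practice such checks would typically run over Spark DataFrames in Databricks rather than plain Python dictionaries.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    """A named, reusable data-quality check applied to each record."""
    name: str
    check: Callable[[dict], bool]

def run_rules(records: list[dict], rules: list[Rule]) -> dict[str, int]:
    """Return a failure count per rule across all records."""
    failures = {rule.name: 0 for rule in rules}
    for record in records:
        for rule in rules:
            if not rule.check(record):
                failures[rule.name] += 1
    return failures

# Hypothetical rules for a customer master-data feed.
rules = [
    Rule("customer_id_present", lambda r: bool(r.get("customer_id"))),
    Rule("country_is_iso2",
         lambda r: isinstance(r.get("country"), str) and len(r["country"]) == 2),
]

records = [
    {"customer_id": "C001", "country": "IN"},
    {"customer_id": "", "country": "India"},
]
print(run_rules(records, rules))  # → {'customer_id_present': 1, 'country_is_iso2': 1}
```

Because each rule is a small named object, the same framework can be reused across feeds and its failure counts fed into governance dashboards.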
Data Integration & Management:
Design ETL/ELT processes for data integration from various sources.
Build software across the data platform, including event-driven data processing, storage, and serving through scalable, highly available APIs, using modern technologies.
Optimize data storage and retrieval for efficient data access.
Work closely with data analysts and business stakeholders to make data easily accessible and understandable to them.
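The ETL/ELT design work described above can be sketched as three composable stages. This is an illustrative toy, not a production pipeline: the function names and the in-memory sink are assumptions standing in for real sources (e.g. Kafka topics) and warehouse tables.

```python
import json

def extract(raw_lines):
    """Extract: parse newline-delimited JSON records from a source feed."""
    return [json.loads(line) for line in raw_lines if line.strip()]

def transform(records):
    """Transform: normalise field names and drop records missing a key."""
    out = []
    for r in records:
        if "id" not in r:
            continue  # quarantine-worthy in a real pipeline; dropped here
        out.append({"id": r["id"], "amount_usd": float(r.get("amount", 0))})
    return out

def load(records, sink):
    """Load: append transformed records to a sink (stand-in for a warehouse table)."""
    sink.extend(records)
    return len(records)

sink = []
raw = ['{"id": 1, "amount": "12.30"}', '{"amount": "9.9"}', '']
loaded = load(transform(extract(raw)), sink)
print(loaded, sink)  # → 1 [{'id': 1, 'amount_usd': 12.3}]
```

Keeping extract, transform, and load as separate pure functions makes each stage independently testable, which is what allows the same pattern to scale up to PySpark jobs orchestrated in Databricks.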
Cloud Data Solutions:
Design and implement cloud-based data solutions using platforms like AWS, Azure, Snowflake and Databricks.
Optimize cloud data storage and processing for cost-effectiveness and performance.
Stay up to date with the latest cloud data technologies and trends.
Develop and enforce data engineering, security, data quality standards through automation.
Performance & Optimization:
Monitor and optimize data system performance.
Troubleshoot and resolve data-related issues.
Conduct performance tuning and capacity planning.
Collaboration & Communication:
Work closely with data engineers, stream-processing specialists, API developers, the DevOps team, and analysts to design systems that scale elastically.
Work closely with business analysts & business users to understand data requirements.
Communicate complex technical concepts to non-technical stakeholders.
Provide technical leadership and mentorship to junior team members.
Propose new ways to make our data platform more scalable, resilient, and reliable, and work across the team to put those ideas into action.
Help build and maintain foundational data products, including (but not limited to) Finance, Titles, Content Sales, Theatrical, and Consumer Products.
Work closely with various other data engineering teams to roll out new capabilities.
Build processes and tools to maintain machine learning pipelines in production.
Professional Certifications:
Any Architect certification
Any cloud certification (AWS, Azure, or GCP)
Job Types: Full-time, Permanent
Pay: ₹3,000,000.00 - ₹4,000,000.00 per year
Benefits:
Health insurance
Provident Fund
Work Location: In person
Beware of fraud agents! Do not pay money to get a job.
MNCJobsIndia.com will not be responsible for any payment made to a third-party. All Terms of Use are applicable.