Location :
LTIM office, PAN India (Hyderabad preferred)
No of Vacancies :
1
Years of Experience :
10-12 years
Any project-specific prerequisite skills (must have)
Primary - REST API, Azure Data Factory (ADF), Azure Databricks (ADB), ADLS, SQL, Azure SQL, SQL Server, PySpark, Python, SAFe/Scrum agile
Secondary - Azure DevOps (CI/CD), Spark, Scala
Detailed JD
Develop and implement data pipelines to extract, transform, and load (ETL) data from various sources into Azure data services, using Azure Data Factory, Azure Databricks, or other tools to orchestrate data workflows and data movement.
Build, test, and run data assets tied to tasks and user stories in the Azure DevOps instance of Enterprise Data Analytics.
Bring a level of technical expertise in the Big Data space that contributes to the strategic roadmaps for Enterprise Data Architecture, Global Data Cloud Architecture, and Global Business Intelligence Architecture, as well as to the development of the broader Enterprise Data Analytics engineering community.
Actively participate in regularly scheduled contact calls with the Enterprise Data Analytics management team to transparently review the status of in-flight projects, the priorities of backlog projects, and the adoption of previous deliveries from Enterprise Data Analytics.
Handle break-fixes and participate in a rotational on-call schedule; on-call duties include monitoring of scheduled jobs and ETL pipelines.
Actively participate in team meetings to transparently review the status of in-flight projects and their progress.
Follow standard practices and frameworks on each project, from development through testing to productionizing, within the appropriate environment laid out by Data Architecture.
Challenge self and others to make an impact that matters, and help the team connect their contributions to a broader purpose.
Set expectations for the team, align the work to individual strengths and competencies, and challenge team members to raise the bar while providing support.
Extensive knowledge of multiple technologies, tools, and processes to improve the design and architecture of the assigned applications.
Design, implement, and manage CI/CD pipelines for software development teams.
Understanding of web applications and CI/CD processes.
Knowledge sharing and documentation:
Contribute to producing and maintaining process, procedure, operational, and architectural documentation.
Change control: ensure compliance with processes and adherence to standards and documentation.
Work with Deloitte Technology leadership and service teams to review documentation and align KPIs to critical steps in our service operations.
Actively participate in ongoing training within the BI space.
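The ETL responsibilities above can be sketched, at a deliberately simplified level, in plain Python. This is not the team's actual implementation - in practice the stages would be ADF activities and PySpark jobs on Databricks - and the sample source rows, the cleaning rule, and the in-memory sink are all invented for illustration:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")


def extract():
    # Stand-in for reading from a source system
    # (an ADF copy activity or a Databricks reader in the real pipeline).
    return [{"amount": "10"}, {"amount": " 25 "}, {"amount": None}]


def transform(rows):
    # Clean and cast; dropping null records is an assumed business rule,
    # purely for illustration.
    out = []
    for row in rows:
        raw = row.get("amount")
        if raw is None:
            continue
        out.append({"amount": int(str(raw).strip())})
    return out


def load(rows, sink):
    # Stand-in for writing to ADLS or Azure SQL.
    sink.extend(rows)
    return len(rows)


def run_pipeline(sink):
    """Orchestrate extract -> transform -> load, logging each stage
    the way a scheduled, monitored job would."""
    rows = extract()
    log.info("extracted %d rows", len(rows))
    cleaned = transform(rows)
    log.info("transformed down to %d rows", len(cleaned))
    loaded = load(cleaned, sink)
    log.info("loaded %d rows", loaded)
    return loaded


if __name__ == "__main__":
    sink = []
    print(run_pipeline(sink))  # 2
```

The stage functions are kept independent so each can be unit-tested on its own, mirroring the build/test/run and monitoring duties described above.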
Expectations:
Ingest data into the target from the source using ADF or ADB.
Apply transformations, if any, based on the business requirements.
The candidate should have worked on the end-to-end ETL cycle using ADF or ADB.
Sources may be APIs or databases (REST API, Oracle, etc.).
Target: Azure SQL.
Quality checks to be done post data transfer.
Logic Apps for SharePoint, Key Vault, and event triggers are used - good to have.
The candidate will have to work in an individual contributor (IC) role.
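The expected flow - ingest from a REST API source into an Azure SQL target, then run a post-transfer quality check - can be sketched as follows. This is a minimal illustration, not the actual pipeline: the JSON payload is hard-coded in place of a real API call, and `sqlite3` stands in for Azure SQL so the sketch is self-contained:

```python
import json
import sqlite3


def fetch_source_records():
    """Stand-in for a REST API call; returns parsed JSON rows.
    A real pipeline would page through the API endpoint over HTTP."""
    payload = '[{"id": 1, "name": "alpha"}, {"id": 2, "name": "beta"}]'
    return json.loads(payload)


def load_to_sql(conn, records):
    """Load records into the target table (sqlite3 here stands in for Azure SQL)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS target (id INTEGER PRIMARY KEY, name TEXT)"
    )
    conn.executemany("INSERT INTO target (id, name) VALUES (:id, :name)", records)
    conn.commit()


def quality_check(conn, expected_count):
    """Post-transfer check: row count in the target must match the source extract."""
    (count,) = conn.execute("SELECT COUNT(*) FROM target").fetchone()
    if count != expected_count:
        raise ValueError(
            f"Quality check failed: {count} rows loaded, {expected_count} expected"
        )
    return count


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    rows = fetch_source_records()
    load_to_sql(conn, rows)
    print(quality_check(conn, len(rows)))  # 2
```

A row-count comparison is only the simplest post-transfer check; real pipelines would typically also validate schemas, null rates, or checksums after the load.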