We are seeking a highly skilled and experienced Lead Data Engineer to spearhead the design, development, and optimization of scalable data infrastructure. As a key member of our data team, you will be responsible for architecting data pipelines, leading data platform initiatives, and mentoring a team of data engineers to enable data-driven decision-making across the organization.
Key Skills: SQL, ETL Tools, Reporting Tools
Key Requirements:
- Day-to-day development activities require knowledge of the concepts below.
- Expert-level knowledge of RDBMS (SQL Server), with a clear understanding of SQL query writing, object creation and management, and performance optimisation of DB/DWH operations.
- Good understanding of transactional and dimensional data modelling: star schema, facts/dimensions, relationships.
- Good understanding of ETL concepts and exposure to at least one ETL tool (SSIS/ADF/Databricks/Airflow).
- Expert-level knowledge of at least one MS reporting/visualisation tool (Power BI/Azure Analysis Services).
- In-depth expertise in Azure Data Factory and Databricks, including building scalable data pipelines, orchestrating complex workflows, implementing dynamic and parameterized pipelines, and optimizing Spark-based data transformations for large-scale integrations.
- Hands-on experience with Databricks Unity Catalog for centralized data governance, fine-grained access control, auditing, and managing data assets securely across multiple workspaces.
- Should have worked on at least one full development lifecycle of one of the below:
  - An end-to-end ETL project (involving any ETL tool)
  - An end-to-end reporting project (involving a reporting tool; Power BI & Analysis Services preferred)
- Ability to write and review test cases, test code, and validate code.
- Ability to perform data analysis on different reports to troubleshoot missing data, suggest value-added metrics, and consult with the customer on best practices.
- Good understanding of SDLC practices such as source control, version management, use of Azure DevOps, and CI/CD.
Project context:
- Should fully understand the context and use case of a project and have a personal vision for it; will interface with the customer directly on a daily basis.
- Should be able to converse with functional users and convert requirements into tangible processes/models and documentation in available templates.
- Should be able to provide consultative options to the customer on the best way to execute projects.
- Should have a good understanding of project dynamics: scoping, setting estimates, setting timelines, working around timelines in case of exceptions, etc.
- Should be able to technically lead a team of developers and testers and perform design reviews, code reviews, etc.
- Should have good presentation and communication skills, written and verbal, especially when expressing technical ideas/solutions.
Preferred skills:
- Knowledge of Python is a bonus.
- Knowledge of Azure DevOps and source control/repos is good to have.
FOR IMMEDIATE RESPONSE PLEASE SEND YOUR UPDATED CV TO arun@qapsoftware.com
Job Type: Full-time
Benefits:
Provident Fund
Application Question(s):
How many years of total IT experience do you have?
How many years of experience do you have working with SSIS/SSAS?
How many years of experience do you have working with Power BI dashboards connected to SSAS Tabular models?
How many years of experience do you have with CI/CD in SQL Server?
Work Location: In person
Beware of fraud agents! Do not pay money to get a job.
MNCJobsIndia.com will not be responsible for any payment made to a third-party. All Terms of Use are applicable.
Job Detail
Job Id: JD3703440
Industry: Not mentioned
Total Positions: 1
Job Type: Contract
Salary: Not mentioned
Employment Status: Permanent
Job Location: TN, IN, India
Education: Not mentioned
Experience: Year
Apply For This Job