Designs, builds, tests, and deploys cloud application solutions that integrate cloud and non-cloud infrastructure. Deploys infrastructure and platform environments and creates proofs of architecture to validate viability, security, and performance.
Must-have skills: Microsoft Fabric
Good-to-have skills: NA
Minimum experience: 5 year(s) required
Educational qualification: 15 years of full-time education
We are seeking a skilled Microsoft Fabric Data Engineer to design, build, optimize, and maintain modern data solutions using Microsoft Fabric. The ideal candidate will have strong experience with data engineering, analytics workloads, cloud-based data platforms, and end-to-end data pipeline development. Minimum 6 years of experience as a Microsoft Fabric Data Engineer.

Key Responsibilities

1. Data Architecture & Modeling
o Design and implement scalable data architectures using Microsoft Fabric components such as Lakehouse, Data Warehouse, OneLake, and KQL Databases.
o Create and optimize star schemas, data marts, semantic models, and medallion architectures.
o Manage and enforce data governance, security, and access control within Fabric workspaces.

2. ETL/ELT Pipeline Development
o Develop, orchestrate, and maintain data ingestion and transformation pipelines using Data Factory, Fabric Pipelines, and Dataflows Gen2.
o Build automated workflows for batch, streaming, or event-driven ingestion.
o Optimize pipeline performance and ensure reliability, scalability, and fault tolerance.

3. Data Integration & Processing
o Work with structured and unstructured data from various enterprise systems, APIs, and external sources.
o Utilize Apache Spark within Fabric Notebooks for large-scale data processing.
o Implement Delta Lake best practices (Z-ordering, OPTIMIZE, VACUUM, etc.); see the brief sketch at the end of this description.

4. Analytics & Reporting Enablement
o Partner with BI analysts to create and optimize Power BI semantic models and Direct Lake mode datasets.
o Publish high-quality, certified data assets for business consumption.
o Ensure data quality, accuracy, and consistency across analytic layers.

5. Monitoring, Optimization & Operations
o Monitor Fabric workloads, storage utilization, capacity models, and performance.
o Implement logging, alerting, and automated testing for pipelines.
o Perform cost optimization for compute workloads and OneLake storage.

6. Collaboration & Stakeholder Engagement
o Work closely with data analysts, data scientists, and business stakeholders to understand data needs.
o Translate business requirements into scalable data solutions.
o Document workflows, architectures, and best practices.

Required Skills & Qualifications
o Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
o Hands-on experience with Microsoft Fabric (Lakehouse, Data Factory, Pipelines, OneLake, Notebooks, Power BI).
o Strong proficiency with SQL, Python, Spark, and Delta Lake.
o Experience with Azure services (Azure Data Lake, Azure Synapse, Azure Data Factory, Azure Active Directory).
o Solid understanding of ETL/ELT methodologies, data modeling, and data warehousing concepts.
o Knowledge of version control (Git) and CI/CD workflows.
o Excellent analytical, problem-solving, and communication skills.

Preferred Qualifications
o Fabric Analyst or Fabric Engineer certification.
o Experience with MLOps or DataOps practices.
o Familiarity with DevOps tools (Azure DevOps, GitHub Actions).
o Experience with streaming technologies (Event Hubs, Kafka, Fabric Real-Time Analytics).
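For illustration of the Delta Lake maintenance practices named above (Z-ordering, OPTIMIZE, VACUUM), here is a minimal sketch only, assuming a PySpark notebook in Fabric and a hypothetical Lakehouse table sales_lakehouse.orders with an order_date column; names and schedule are placeholders, not part of this role's actual environment.

    from pyspark.sql import SparkSession

    # Fabric notebooks provide a Spark session; getOrCreate() reuses it when one exists.
    spark = SparkSession.builder.getOrCreate()

    # Table and column names below are hypothetical placeholders.
    # Compact small files and co-locate rows on a commonly filtered column (Z-ordering).
    spark.sql("OPTIMIZE sales_lakehouse.orders ZORDER BY (order_date)")

    # Remove data files no longer referenced by the table, keeping the default retention window.
    spark.sql("VACUUM sales_lakehouse.orders")

In practice, maintenance like this is typically scheduled by running the notebook from a Fabric pipeline rather than manually.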