Associate Principal Architect (ETL + Azure Data)

KA, IN, India

Job Description

Posted on: 07-01-2025



Experience: 10+ years



India-Bengaluru



Job ID #1002177





Role Requirements



We are seeking a highly skilled Data Engineer with experience in Azure Data Factory (ADF) and API-based data integration to serve as a subject matter expert within the Business Intelligence and Data Engineering team. This role will play a key part in modernizing integrations between the Applied EPIC agency management system and Salesforce CRM, replacing existing Informatica workflows with Microsoft-native services across Azure. The ideal candidate has hands-on experience designing and implementing cloud-based data pipelines, along with strong experience working with REST and SOAP APIs, SDKs, and end-to-end data flow orchestration within Azure. The role demands deep technical expertise in Azure integration services, strong SQL skills, SOAP XML and SharePoint API knowledge, and XML and JSON manipulation using SQL. A collaborative mindset, experience leading delivery teams, and a disciplined approach to data quality, governance, and operational monitoring are essential.

Key Responsibilities

  • Design and build modern data integration pipelines in Azure Data Factory to replace legacy Informatica workflows between EPIC AMS and Salesforce.
  • Develop and maintain Azure Functions using .NET or Python to interface with EPIC SDK APIs for reading, writing, and bulk updating policy, client, and transaction data.
  • Implement Logic Apps workflows to orchestrate near real-time integrations, data refreshes, and error-handling processes.
  • Configure and manage Azure API Management (APIM) to enable secure and scalable API calls to EPIC and Salesforce endpoints.
  • Design robust data flow orchestration patterns using ADF pipelines, triggers, and linked services across OneLake and ADLS Gen2.
  • Implement monitoring and logging using Application Insights, Log Analytics, and Azure Monitor to track pipeline health and performance.
  • Collaborate closely with Salesforce, BI, and Data Architecture teams to align data models and ensure consistent schema mappings across systems.
  • Support data transformation and validation using Mapping Data Flows, Synapse Notebooks, or Fabric Dataflows as required.
  • Build reusable frameworks for error handling, retries, and dead-letter queues using Service Bus and Event Grid.
  • Enforce data governance and compliance requirements (PII, audit trails, IPE) through secure credential management in Azure Key Vault and structured logging.
  • Contribute to CI/CD pipelines using Azure DevOps or GitHub Actions for version control, testing, and deployment automation.

Required Skills and Experience

  • 5+ years of experience as a Data Engineer or Integration Developer, with at least 3 years of hands-on experience in Azure Data Factory.
  • Proven experience integrating on-premises or SaaS systems using APIs or SDKs, preferably with Applied EPIC, Salesforce, or similar CRM/AMS platforms.
  • Proficiency in ADF pipeline orchestration, including Copy Activity, REST connectors, Web Activities, and Custom Activities.
  • Strong programming skills in C# (.NET) or Python for API development and automation.
  • Hands-on experience with Logic Apps, Service Bus, Event Grid, and Azure Functions.
  • Working knowledge of Azure API Management, Azure Key Vault, Application Insights, and Azure Monitor.
  • Solid understanding of data modeling, ETL/ELT patterns, and data quality frameworks within the Azure ecosystem.
  • Familiarity with Power BI, Microsoft Fabric Lakehouse, or Synapse Analytics is a plus.
  • Excellent problem-solving, documentation, and communication skills with a collaborative, delivery-focused approach.

Preferred Qualifications

  • Experience working with Applied EPIC SDKs or agency management system APIs.
  • Exposure to Salesforce Bulk API, SOQL, and the Salesforce data model.
  • Understanding of insurance brokerage, financial services, or CRM data domains.
  • Azure certifications such as DP-203, DP-500, or PL-300.
  • Experience with CI/CD pipelines using Azure DevOps or GitHub Actions.

Key Outcomes

  • Successful replacement of Informatica pipelines with Azure-native orchestration using Azure Data Factory and Logic Apps.
  • Reliable, auditable, and automated integration between EPIC and Salesforce data platforms.
  • Scalable and maintainable API-driven data architecture aligned with Microsoft Fabric strategy.
  • Improved monitoring, performance, and data quality through centralized observability tools.



Job Detail

  • Job Id
    JD5071942
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type
    Full Time
  • Salary
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    KA, IN, India
  • Education
    Not mentioned
  • Experience
    10+ years