Career Mobility and Development: When you join FedEx, you're joining a team with possibilities that span the globe, from opportunities for advancement and location transfer to training and leadership programs.
Total Compensation and Benefits Package: We want our people to stay with us for the long term, so we offer competitive benefits, flexible work arrangements, and programs that support well-being.
Equal Opportunities: Our greatest asset at FedEx is our people. We are committed to building a diverse, equitable, and inclusive workforce, and we offer equal opportunity, fairness, and respect to all, regardless of who you are. We encourage you to apply even if you feel your experience does not align with every aspect of the job description; you could be exactly who we need for this or another opportunity. We do not tolerate discrimination or harassment based on race, color, ethnicity, national origin, religion, sex, age, genetic information, citizenship, disability, marital status, pregnancy, sexual orientation, gender identity, gender expression, veteran status, or any other characteristic protected under national, state, or local laws. We will reasonably accommodate team members and third parties with physical and mental disabilities.

Please note that the job will close at 12am on the Posting Close Date, so please submit your application prior to the Close Date.

Company: INT FedEx Express Transportation and Supply Chain Services (India) Pvt. Ltd.
City: Mumbai
Scheduled Weekly Hours: 48
Worker Type: Regular
Posting Start Date: 16-Dec-2025
Posting Close Date: 30-Dec-2025
Job Family: FXE-MEISA: Data Engineer
Position Summary: Grade - T7

What your main responsibilities are:

Data Architecture & Engineering: Architect, design, and implement robust, scalable data pipelines and data lake solutions across diverse data sources using Azure Data Factory, Databricks, Azure Synapse, and other Azure services.

ETL/ELT Development & Optimization: Build, automate, and optimize ETL/ELT processes for data ingestion, cleansing, transformation, and storage from RDBMS (Oracle, Teradata, SQL Server) and other structured/unstructured systems. Ensure efficient data flow and accessibility for analytics and machine learning workloads.

Cloud Data Solutions: Lead the deployment and management of cloud-native data platforms leveraging Azure ecosystem tools. Drive adoption of new cloud capabilities that improve performance, scalability, and data democratization.

Leadership, Mentorship & Collaboration: Lead and inspire a team of data engineers, providing technical guidance, mentorship, and coaching to help them develop their skills and deliver quality solutions. Foster strong cross-functional collaboration with analysts, data scientists, and business teams to ensure alignment between data initiatives and organizational priorities.

Data Modeling & Governance: Develop and maintain data models and standards that support consistency, reusability, and governance across teams. Implement data quality frameworks, lineage tracking, and metadata management solutions.

Monitoring, Optimization & Automation: Oversee monitoring and observability systems for data pipelines. Drive continuous improvement through automation, CI/CD practices, and infrastructure-as-code for data workflows.

Presentation & Strategic Communication: Deliver compelling presentations and reports to senior leadership and technical stakeholders. Effectively communicate complex technical concepts, architectural designs, and project outcomes through clear storytelling and visualization. Influence data strategy and advocate for innovation and best practices across teams.

What we are looking for

Education: Bachelor's degree in Computer Science, Information Systems, Engineering, or a quantitative discipline such as Mathematics or Statistics.
Master's degree in a relevant field preferred.

Experience: 8+ years of experience in data engineering, data architecture, or big data environments, including experience leading teams or projects.

Technical Skills:
Expert in SQL, Python, and PySpark
Strong experience with Azure Data Factory, Databricks, Synapse, Data Lake, Event Hubs, and Logic Apps
Deep understanding of ETL/ELT, data modeling, data warehousing, and data lakehouse architecture
Experience with DevOps for data (CI/CD, Infrastructure as Code)
Familiarity with real-time data streaming (Kafka, Event Hubs)
Knowledge of data quality, metadata management, governance, and security
Exposure to machine learning pipelines, AI integration, and advanced analytics enablement is an advantage
Good to have:
Experience with Delta Lake, Unity Catalog, and data mesh architectures
Working knowledge of Power BI and Tableau for visualization
Familiarity with Azure ML, Airflow, and Kubernetes environments