Job Title: Senior Consultant - Data Engineer (Knowledge Graph)
Career Level: D2
Introduction to role
------------------------
Are you ready to disrupt an industry and change lives? We are seeking a dynamic Senior Consultant - Data Engineer with expertise in knowledge graph concepts to join our team. Your work will have a direct impact on our ability to develop life-changing medicines, empowering the business to perform at its peak. Dive into a world where cutting-edge science meets leading digital technology platforms and data, all with a passion for impacting lives through data, analytics, AI, machine learning, and more.
Accountabilities
--------------------
Collaborate with project teams across diverse domains to understand their data needs and provide expertise in data ingestion and enrichment processes
Design, develop, and maintain scalable data pipelines and ETL workflows for the Knowledge Graph Team
Implement advanced data engineering techniques to ensure optimal performance and reliability of data systems
Work closely with data scientists and analysts to ensure high-quality data for knowledge graph construction and advanced analytics
Troubleshoot and resolve complex issues related to data pipelines, ensuring efficient data flow
Optimize data storage and processing for performance, scalability, and cost-efficiency
Stay updated with the latest trends in data engineering, analytics, and AWS DevOps to drive innovation
Provide DevOps/CloudOps support for the Knowledge Graph Team as needed
Essential Skills/Experience
-------------------------------
Bachelor's or Master's degree in Computer Science, Engineering, or a related field
Strong hands-on programming experience (preferably Python)
Experience working with any relational database (e.g., PostgreSQL, MySQL, SQL Server)
Proficient in version control using Git
Solid understanding of data engineering principles and standard methodologies
Desirable Skills/Experience
-------------------------------
Practical experience with Knowledge Graphs (RDF or LPG models)
Proficiency in graph query languages such as SPARQL, Cypher, or Gremlin
Hands-on experience with AWS services (e.g., S3, EC2, Lambda, Glue)
Experience working with Snowflake for data warehousing or analytics
Familiarity with Docker and containerized deployments
Experience with data transformation tools like DBT
Experience with data orchestration tools such as Apache Airflow
Understanding of CI/CD and DevOps practices
Knowledge of FAIR (Findable, Accessible, Interoperable, Reusable) data principles
When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world.
At AstraZeneca, we use technology to impact patients and ultimately save lives. As part of a purpose-led global organization, we push the boundaries of science to discover and develop life-changing medicines. Our work unlocks the potential of science by improving efficiencies and driving productivity through automation and data simplification. With investment behind us, there's no slowing us down. Join us at a crucial stage of our journey in becoming a digital and data-led enterprise.
Ready to make a meaningful impact? Apply now and be part of our innovative team!