Data Engineer

Bangalore, Karnataka, India

Job Description

Astellas Pharma Inc. is a pharmaceutical company conducting business in more than 70 countries around the world. We are committed to turning innovative science into medical solutions that bring value and hope to patients and their families. Keeping our focus on addressing unmet medical needs and conducting our business with ethics and integrity enables us to improve the health of people throughout the world. For more information on Astellas, please visit our website at www.astellas.com.


This position is based in Bangalore, India. We recognize the importance of work/life balance, and we are proud to offer a hybrid working solution allowing time to connect with colleagues at the office with the flexibility to also work from home. We believe this provides the most productive work environment for all employees to succeed and deliver. Hybrid work from certain locations may be permitted in accordance with the Responsible Flexibility Guidelines.


Purpose and Scope:




As an Information X Data Engineer, you will play a crucial role in leading, designing, building, and maintaining our data infrastructure. Your expertise in data engineering across multiple platforms, including Azure/AWS cloud data warehousing, Databricks, PySpark, SQL, Business Intelligence (Qlik), application management, and other related technologies, will be instrumental in enabling data-driven decision-making and outcomes. Working within small agile teams that are focused on delivering value, you will play a pivotal role in building, maintaining, and enhancing our strategic systems across the organisation. This is a fantastic global opportunity to use your proven agile delivery skills across a diverse range of initiatives, utilise your development skills, and contribute to the continuous improvement/delivery of critical IT solutions.


Essential Skills & Knowledge:




Strong communication and collaboration skills, coupled with excellent problem-solving ability and attention to detail



Subject Matter Expertise: a strong understanding of data architecture, engineering, operations, and data modelling (e.g. data vault, dimensional data models).



Experience within the Life Sciences, Pharma, or Manufacturing industries is preferred.



Proven experience in building robust data pipelines; experience in (near) real-time processing is preferred.



Technical Proficiency: strong coding skills in, for example, Python, PySpark, and SQL.



Engineering experience across multiple platforms, for example AWS, Azure, Databricks, or Change Data Capture tools (e.g. Fivetran), is preferred.



Expertise in building data pipelines and a strong understanding of data management best practices



Proficiency in network architecture and security concepts



Proven experience with data analytics practices and techniques



Agile Practices: Experience working in Agile development environments, participating in sprint planning, stand-ups, and retrospectives.



Cloud Data Solutions: Familiarity with other cloud platforms (AWS, Azure, Google Cloud) and their data services



Analytical Thinking: Demonstrated ability to lead ad hoc analyses, identify performance gaps, and foster a culture of continuous improvement



Agile Champion: Adherence to DevOps principles, automation, and a proven track record with CI/CD pipelines for continuous delivery



Ability to understand and interpret business requirements and translate them into technical requirements.



Responsibilities and Accountabilities:




Data Pipeline Development: Design, build, and optimize data pipelines using DWH technologies, Databricks, Qlik, and other platforms as required, ensuring data quality, reliability, and scalability.



Application Transition: Support the migration of internal applications to Databricks-based (or equivalent) solutions. Collaborate with application teams to ensure a seamless transition.



Manage continuous improvement, continuous development, DevOps, and RunOps activities at the application, data, and infrastructure levels, whether on cloud or on premises.



Mentorship and Leadership: Lead and mentor junior data engineers. Share best practices, provide technical guidance, and foster a culture of continuous learning.



Data Strategy Contribution: Contribute to the organization's data strategy by identifying opportunities for data-driven insights and improvements.



Participate in small, focused mission teams to deliver value-driven solutions aligned with our global and bold move priority initiatives and beyond.



Design, develop, and implement robust and scalable data analytics solutions using modern technologies.



Collaborate with cross-functional teams and practices across the organisation, including Commercial, Manufacturing, Medical, FoundationX, and GrowthX, and support other X (transformation) Hubs and Practices as appropriate, to understand user needs and translate them into technical solutions.



Provide technical support to internal users, troubleshooting complex issues and restoring system uptime as quickly as possible.



Champion continuous improvement initiatives, identifying opportunities to optimise the performance, security, and maintainability of existing data and platform architecture and other technology investments.



Participate in the continuous delivery pipeline, adhering to DevOps best practices for version control, automation, and deployment, and ensuring effective management of the FoundationX backlog.



Leverage your knowledge of data engineering principles to integrate with existing data pipelines and explore new possibilities for data utilization.



Stay up to date on the latest trends and technologies in data engineering and cloud platforms.



Experience:




5+ years of demonstrable experience in:


Data engineering with a strong understanding of PySpark and SQL, including building and optimizing data pipelines.


Data engineering and integration tools (e.g., Databricks, Change Data Capture)


Utilizing cloud platforms (AWS, Azure, GCP). A deeper understanding/certification of AWS and Azure is considered a plus.


Relational and non-relational databases.



Qualifications:




Bachelor's degree in Computer Science, Information Technology, or a related field (Master's preferred), or equivalent experience.


Any relevant cloud-based integration certification at associate or professional level. For example:


AWS Certified DevOps Engineer (Associate or Professional)



AWS Certified Developer (Associate or Professional)



Databricks Certified Engineer



Qlik Sense Data Architect / Business Analyst (or similar platform)



MuleSoft Certified Integration Architect - Level 1



Microsoft Certified Azure Integration and Security.



Proficient in RESTful APIs



AWS, CDMP, MDM, DBA, SQL, SAP, TOGAF, API, CISSP, VCP (any relevant certification)



MuleSoft




Understanding of MuleSoft's Anypoint Platform and its components



Experience with designing and managing API-led connectivity solutions



Knowledge of integration patterns and best practices



AWS




Experience provisioning, operating, and managing AWS environments



Experience developing code in at least one high-level programming language



Understanding of modern development and operations processes and methodologies



Ability to automate the deployment and configuration of infrastructure using AWS services and tools



Experience with continuous integration and continuous delivery (CI/CD) methodologies and tools



Microsoft Azure




Fundamental understanding of Microsoft Azure and AWS and the data services they provide



Experience with Azure services related to computing, networking, storage, and security



Knowledge of general IT security principles and best practices



Understanding of cloud integration patterns and Azure integration services such as Logic Apps, Service Bus, and API Management



Preferred Qualifications:




Subject Matter Expertise: a strong understanding of data architecture, engineering, operations, and reporting within the Life Sciences/Pharma industry across the Commercial, Manufacturing, and Medical domains.



Experience in other complex and highly regulated industries will also be considered, e.g. healthcare, government, or financial services.



Data Analysis and Automation Skills: Proficient in identifying, standardizing, and automating critical reporting metrics and modelling tools



Analytical Thinking: Demonstrated ability to lead ad hoc analyses, identify performance gaps, and foster a culture of continuous improvement.



Technical Proficiency: Strong coding skills in SQL, R, and/or Python, coupled with expertise in machine learning techniques, statistical analysis, and data visualization.



Agile Champion: Adherence to DevOps principles and a proven track record with CI/CD pipelines for continuous delivery.



Other Critical Skills Required:




Cross-Cultural Experience: Work experience across multiple cultures and regions, facilitating effective collaboration in diverse environments.



Innovation and Creativity: Ability to think innovatively and propose creative solutions to complex technical challenges.



Global Perspective: Demonstrated understanding of global pharmaceutical or healthcare technical delivery, providing exceptional customer service and enabling strategic insights and decision-making.





Astellas is committed to equality of opportunity in all aspects of employment.


EOE including Disability/Protected Veterans



Job Detail

  • Job Id: JD3719115
  • Total Positions: 1
  • Job Type: Full Time
  • Employment Status: Permanent
  • Job Location: Bangalore, Karnataka, India
  • Experience: 5+ years