Lead DataOps & Automation Engineer

Bangalore, Karnataka, India

Job Description


Summary

Salary: Competitive
Team: Data Science and Engineering
Location: India - Bangalore IT Capability Centre



Data and analytics excellence at Dyson is delivered by a diverse, collaborative global community, and at the heart of that community is a hub team that enables all others. Dyson's Analytics Platform team has seen significant investment in cloud technologies and tools; combined with an expansive scope and no shortage of ambition and momentum, the team is recognized throughout the organization, up to the highest level, as critical to all of Dyson's strategic objectives.

With a 'one-team' approach, the global community is on a mission to:

Evolve existing solutions to stay ahead.

Embed emerging solutions to capitalize on potential benefits.

Deliver conceptualized & future solutions to introduce net-new capability.

The Team

As the platform delivering the data, technology, and community provision that enables Dyson's global data and analytics capabilities, the Dyson Analytics Platform team has end-to-end responsibility for managing data platforms and integrations, enabling our development and analytics teams to deliver continuous features and capabilities for our customers, partners, and employees.

The Dyson Analytics Platform team is a multi-disciplinary, global team providing round-the-clock development and operations, including platform architecture, engineering, management, DataOps, governance, and advanced analytics expertise.

Involved in every aspect of Dyson's global business, from finance to product development and manufacturing to owner experience, the team seeks to deliver solutions that generate tangible business value.

About the role:

Conduct the data platform assessment and lay out a phase-wise CI/CD and DataOps implementation roadmap, including tool fitment.

Implement CI/CD automation and DataOps in each of the data products for optimal delivery.

Adhere to the defined development delivery processes and guidelines, covering design and development, implementation, change management, SLA compliance, productivity, and other application goals.

Develop innovative approaches to performance optimization and automation, and ensure timely updates to stakeholders.

Understand, learn, and apply new automated build, test, and deployment capabilities, and help project teams integrate such solutions (a sketch of one such automated test gate follows this list).

Expand awareness, knowledge, and experience of automation within CI/CD pipelines.

Design, propose, and facilitate organizational and process improvements as needed to support automation, DevSecOps, and DataOps.

Participate in the design of cloud service automation towards Infrastructure-as-Code and the engineering of new cloud and on-prem technologies.

Execute process engineering and operational improvement initiatives for cloud-focused automation tooling.
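
To make the automated test gate mentioned above concrete, below is a minimal sketch of the kind of data quality check a DataOps CI/CD stage might run. This is an illustration, not Dyson's implementation: the table name, column, and thresholds are hypothetical placeholders, and it assumes the google-cloud-bigquery Python client with default application credentials.

```python
"""Minimal DataOps quality gate: fail the CI stage if a table looks unhealthy.

Hypothetical sketch -- table, column, and thresholds are placeholders.
Assumes `pip install google-cloud-bigquery` and default GCP credentials.
"""
import sys

from google.cloud import bigquery

TABLE = "my-gcp-project.analytics.daily_orders"  # placeholder table
MIN_ROWS = 1_000          # fail if today's load produced fewer rows than this
MAX_NULL_RATIO = 0.01     # fail if more than 1% of order_id values are NULL


def main() -> int:
    client = bigquery.Client()

    # One aggregate query yields both checks in a single table scan.
    query = f"""
        SELECT
          COUNT(*) AS row_count,
          SAFE_DIVIDE(COUNTIF(order_id IS NULL), COUNT(*)) AS null_ratio
        FROM `{TABLE}`
        WHERE load_date = CURRENT_DATE()
    """
    row = next(iter(client.query(query).result()))

    failures = []
    if row.row_count < MIN_ROWS:
        failures.append(f"row_count={row.row_count} < {MIN_ROWS}")
    if row.null_ratio is not None and row.null_ratio > MAX_NULL_RATIO:
        failures.append(f"null_ratio={row.null_ratio:.4f} > {MAX_NULL_RATIO}")

    if failures:
        print("Data quality gate FAILED: " + "; ".join(failures))
        return 1  # non-zero exit makes Jenkins/GitLab mark the stage as failed

    print("Data quality gate passed.")
    return 0


if __name__ == "__main__":
    sys.exit(main())
```

Wired into a Jenkins or GitLab stage, the non-zero exit code would block promotion of a data product whose latest load fails the checks.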

About you:

7+ years' experience implementing DataOps and DevOps CI (continuous integration) and CD (continuous delivery) using various tools.

5+ years' experience working with Jenkins, Ansible, Docker, Kubernetes, Bitbucket, GitLab, and GitHub.

Experience setting up Git following best practices for different development teams.

Working experience on at least one of the AWS, GCP, or Azure platforms is required.

Full life-cycle product development experience.

Experience integrating DevOps tools such as GitLab and Jenkins.

Experience setting up version control systems (VCS) to integrate smoothly with CI/CD pipelines.

Experience working on microservices is a plus.

Good exposure to monitoring tools (Datadog, ELK, New Relic, Splunk, Dynatrace).

Strong skills and experience in establishing CI/CD pipelines using GCP DevOps services.

Strong skills in integrating and operating static and dynamic code scans for security testing, as well as open-source code scans, within pipelines.

Strong skills in integrating automated functional and performance tests within GCP DevOps pipelines.

Skills and experience in GCP, along with scripting and development.

Ability to provision, monitor, optimize, and scale GCP infrastructure using APIs (see the sketch after this list).

Knowledge and/or experience of designing, developing, deploying, and running CI/CD in a DevOps environment.

Experience with Docker containers, Kubernetes, and other microservices and container technologies is highly preferred.

Hands-on experience with the following cloud services and tools is mandatory: Google BigQuery, Google Cloud Storage, Google Cloud Functions, Google Dataflow, Google Cloud SQL, Google Firestore, Google Cloud Composer, Google Compute Engine, Google App Engine, Google Kubernetes Engine, Google Cloud Run, Google Data Catalog, Google Pub/Sub, Google Vertex AI, Terraform, Terragrunt, dbt, AWS Redshift, S3 buckets, Lambda functions, Azure Blob Storage, Azure SQL Database, Datadog.
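
As a concrete illustration of provisioning GCP infrastructure through its APIs (referenced above), here is a minimal Python sketch that idempotently creates a BigQuery dataset and a Cloud Storage bucket. The project ID, resource names, and region are hypothetical placeholders; it assumes the google-cloud-bigquery and google-cloud-storage clients and default application credentials.

```python
"""Minimal sketch: provisioning GCP resources through the client APIs.

Hypothetical sketch -- project ID, names, and region are placeholders.
Assumes `pip install google-cloud-bigquery google-cloud-storage` and
default application credentials.
"""
from google.cloud import bigquery, storage

PROJECT = "my-gcp-project"   # placeholder project ID
LOCATION = "asia-south1"     # example region (Mumbai)


def provision() -> None:
    # Idempotent BigQuery dataset creation.
    bq = bigquery.Client(project=PROJECT)
    dataset = bigquery.Dataset(f"{PROJECT}.analytics_staging")
    dataset.location = LOCATION
    bq.create_dataset(dataset, exists_ok=True)  # no-op if it already exists
    print(f"Dataset ready: {PROJECT}.analytics_staging")

    # Idempotent Cloud Storage bucket creation.
    gcs = storage.Client(project=PROJECT)
    bucket_name = f"{PROJECT}-landing-zone"     # bucket names are global
    if gcs.lookup_bucket(bucket_name) is None:  # returns None when absent
        gcs.create_bucket(bucket_name, location=LOCATION)
    print(f"Bucket ready: {bucket_name}")


if __name__ == "__main__":
    provision()
```

In a stack like the one listed above, Terraform/Terragrunt would more likely own resource provisioning, with API scripts of this kind reserved for monitoring, optimization, and scaling hooks.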

Dyson is an equal opportunity employer. We know that great minds don't think alike, and it takes all kinds of minds to make our technology so unique. We welcome applications from all backgrounds, and employment decisions are made without regard to race, colour, religion, national or ethnic origin, sex, sexual orientation, gender identity or expression, age, disability, protected veteran status, or any other dimension of diversity.

Posted: 22 January 2024

Dyson




Job Detail

  • Job Id: JD3239032
  • Industry: Not mentioned
  • Total Positions: 1
  • Job Type: Full Time
  • Salary: Not mentioned
  • Employment Status: Permanent
  • Job Location: Bangalore, Karnataka, India
  • Education: Not mentioned
  • Experience: Year