Data Engineer (L3)

UP, IN, India

Job Description

Data Engineer - (L3) || GCP Certified



Experience Level:

4-7 years

Location:

Noida Office or at Client Site as Required

Employment Type:

Full-Time

Work Mode:

In-office / Hybrid

Notice:

Immediate joiners

Client Profile:

A leading technology company

As a Data Engineer, you will design, develop, and support data pipelines and related data products and platforms. Your primary responsibilities include designing and building data extraction, loading, and transformation pipelines across on-prem and cloud platforms. You will perform application impact assessments, requirements reviews, and develop work estimates. Additionally, you will develop test strategies and site reliability engineering measures for data products and solutions, participate in agile development "scrums" and solution reviews, mentor junior Data Engineering Specialists, lead the resolution of critical operations issues, and perform technical data stewardship tasks, including metadata management, security, and privacy by design.

Required Skills:

  • Design, develop, and support data pipelines and related data products and platforms.
  • Design and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms.
  • Perform application impact assessments, requirements reviews, and develop work estimates.
  • Develop test strategies and site reliability engineering measures for data products and solutions.
  • Participate in agile development "scrums" and solution reviews.
  • Mentor junior Data Engineers.
  • Lead the resolution of critical operations issues, including post-implementation reviews.
  • Perform technical data stewardship tasks, including metadata management, security, and privacy by design.
  • Design and build data extraction, loading, and transformation pipelines using Python and other GCP data technologies.
  • Demonstrate SQL and database proficiency in various data engineering tasks.
  • Automate data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect.
  • Develop Unix scripts to support various data operations.
  • Model data to support business intelligence and analytics initiatives.
  • Utilize infrastructure-as-code tools such as Terraform, Puppet, and Ansible for deployment automation.
  • Expertise in GCP data warehousing technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, and Dataproc (good to have).
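As a minimal illustration of the extract-transform-load work this role involves, the sketch below uses only the Python standard library; the file contents and table name are hypothetical, and in-memory SQLite stands in for a warehouse such as BigQuery or Cloud SQL.

```python
# Minimal ETL sketch (illustrative only): extract raw CSV records,
# transform by dropping malformed rows and casting types, then load
# into a warehouse table (SQLite used here in place of BigQuery).
import csv
import io
import sqlite3

# Extract: parse raw CSV (here from a string; normally a file or GCS object).
raw = "order_id,amount\n1,100.50\n2,abc\n3,75.00\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: drop malformed records and cast types.
def clean(row):
    try:
        return (int(row["order_id"]), float(row["amount"]))
    except ValueError:
        return None  # a real pipeline would quarantine bad rows instead

cleaned = [r for r in (clean(row) for row in rows) if r is not None]

# Load: insert the cleaned rows into the target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", cleaned)

total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 175.5 -- row 2 was dropped as malformed
```

In practice each of these stages would be a task in a DAG orchestrated by a tool such as Apache Airflow, Control-M, or Prefect, as listed above.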

Qualifications:



  • Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or a related field.
  • 4+ years of data engineering experience.
  • 2 years of data solution architecture and design experience.
  • GCP Certified Data Engineer (preferred).
Job Type: Full-time

Pay: Up to ₹1,400,000.00 per year

Application Question(s):

  • What is your notice period (in days)?
  • What is your current annual salary (in INR)?
  • What is your expected annual salary (in INR)?
Experience:

  • Designing, developing, and supporting data pipelines: 4 years (Required)
  • Developing test strategies & measures for data products: 5 years (Required)
  • GCP Data Technologies: 4 years (Required)
  • SQL and databases: 5 years (Required)
  • Agile development "scrums" and solution reviews: 4 years (Required)
  • Automating data workflows by setting up DAGs: 5 years (Required)
Location:

Noida, Uttar Pradesh (Required)
Work Location: In person



Job Detail

  • Job Id
    JD3702707
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type
    Contract
  • Salary:
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    UP, IN, India
  • Education
    Not mentioned
  • Experience
    Year