Data Engineer

KA, IN, India

Job Description:
Business Title
Data Engineer
Years of Experience
Minimum 3 years, maximum 7 years.
Job Description
We are looking for a highly skilled GCP Data Engineer with strong programming fundamentals and cloud-native architecture experience. The candidate should have hands-on expertise in building scalable data pipelines on Google Cloud Platform, integrating external APIs, and applying software engineering principles to data workflows.
Must have skills
1. 3 to 7 years of experience in data engineering, with at least 1 year on Google Cloud Platform.

2. Strong proficiency in Python, including OOP principles, modular design, and clean coding standards.

3. Hands-on with GCP services: BigQuery, Cloud Functions, Cloud Run, Cloud Build, Dataform, Pub/Sub, Eventarc, Cloud Storage, and Cloud Composer.

4. Solid understanding of SQL, data modeling, and data warehousing concepts.

5. Familiarity with CI/CD pipelines and Git-based workflows.

6. Strong problem-solving skills and ability to work independently in a fast-paced environment.
Good to have skills
1. Familiarity with other cloud technologies

2. Working experience with JIRA and Agile methodologies

3. Stakeholder communication

4. Microsoft Office

5. Cross-functional teamwork, internally and with external clients

6. Team leadership

7. Requirements gathering
Key responsibilities
1. Implement modular, object-oriented Python applications for data ingestion and transformation.

2. Apply strong object-oriented programming skills: implement scalable, maintainable data pipelines using core OOP principles (encapsulation, inheritance, polymorphism, and abstraction); build modular systems with well-defined class hierarchies; apply SOLID principles to ensure clean architecture and separation of concerns; and leverage design patterns to solve complex problems and improve code readability and testability.

3. Build and maintain ETL/ELT pipelines using GCP services: BigQuery, Cloud Functions, Cloud Run, Cloud Composer, and Dataform.

4. Integrate with external APIs to extract and load data into BigQuery.

5. Develop event-driven architectures using Pub/Sub and Eventarc to trigger workflows and manage real-time data streams.

6. Automate deployment and testing workflows using Cloud Build and CI/CD pipelines.

7. Optimize performance and cost of data processing and storage across GCP services.

8. Collaborate with cross-functional teams to translate business requirements into scalable data solutions.

9. Ensure data security, governance, and compliance across all pipelines and storage layers.
Education Qualification
1. Bachelor's or Master's degree, or equivalent
Certifications (If Any)
1. GCP Professional Data Engineer Certification.

2. Snowflake Associate / Core
Shift timing
12 PM to 9 PM and/or 2 PM to 11 PM (IST)
Location:
DGS India - Bengaluru - Manyata H2 block
Brand:
Merkle
Time Type:
Full time
Contract Type:
Permanent



Job Detail

  • Job Id
    JD4818717
  • Total Positions
    1