GCP Data Engineer

Pune, Maharashtra, India

Job Description


About the Role

Duration: 6 Months
Location: Pune
Timings: Full Time (as per company schedule)
Notice Period: Within 15 days or immediate joiner
Experience: 4-6 Years
Salary: 9 LPA - 11 LPA

Key Responsibilities:
  • Design, build, and maintain scalable data pipelines and workflows using Apache Airflow on Google Cloud Platform (GCP).
  • Develop and optimize ETL processes to extract, transform, and load data into BigQuery for analysis and reporting purposes.
  • Collaborate with cross-functional teams to gather requirements, define data models, and implement solutions that meet business needs.
  • Monitor pipeline performance, troubleshoot issues, and implement enhancements to ensure reliability, efficiency, and data quality.
  • Stay updated on emerging technologies and best practices in data engineering and GCP ecosystem to drive continuous improvement and innovation.
  • Work with Google Cloud Platform (GCP) technologies such as Apache Airflow and BigQuery.
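As a concrete illustration of the transform step such pipelines perform, here is a minimal stdlib-only Python sketch. In production this logic would typically run inside an Airflow task (for example a PythonOperator) before the cleaned rows are loaded into BigQuery; all names and data below are illustrative, not taken from this posting.

```python
# Minimal sketch of the "T" in an ETL pipeline: parse raw CSV,
# drop incomplete records, and cast fields to typed values.
# Illustrative only -- a real pipeline would read from Cloud
# Storage and load the result into BigQuery via an Airflow task.
import csv
import io

RAW = """order_id,amount,currency
1001,250.00,INR
1002,,INR
1003,99.50,INR
"""

def transform(raw_csv: str) -> list[dict]:
    """Parse raw CSV, skip rows with missing amounts, cast types."""
    rows = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        if not row["amount"]:  # skip incomplete records
            continue
        rows.append({
            "order_id": int(row["order_id"]),
            "amount": float(row["amount"]),
            "currency": row["currency"],
        })
    return rows

cleaned = transform(RAW)
print(len(cleaned))  # 2 valid rows survive the cleaning step
```

The same pattern scales up: keep the transform a pure function of its input so it can be unit-tested outside the orchestrator.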
Qualifications:
  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • Proven experience in building and managing data pipelines using Apache Airflow, preferably in a cloud environment (GCP).
  • Strong proficiency in SQL and experience working with BigQuery or other data warehouse solutions.
  • Experience with Python programming language and related libraries/frameworks for data manipulation and processing.
  • Familiarity with GCP services such as Cloud Storage, Dataflow, and Pub/Sub is a plus.
  • Excellent problem-solving skills, attention to detail, and ability to work effectively in a fast-paced, collaborative environment.
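To illustrate the kind of warehouse-style SQL this role calls for, the sketch below runs a simple grouped aggregation using Python's stdlib sqlite3. BigQuery's Standard SQL dialect differs in details, and the table and data are purely hypothetical.

```python
# Illustrative only: the shape of an aggregation query a BigQuery
# reporting job might run, demonstrated on stdlib sqlite3 so the
# example is self-contained and runnable without cloud access.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (region TEXT, amount REAL)")
con.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("west", 100.0), ("west", 50.0), ("east", 75.0)],
)
# Total order value per region, sorted for a deterministic result.
totals = con.execute(
    "SELECT region, SUM(amount) FROM orders "
    "GROUP BY region ORDER BY region"
).fetchall()
print(totals)  # [('east', 75.0), ('west', 150.0)]
```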
Other Personal Characteristics
  • Dynamic, engaging, self-reliant developer
  • Ability to deal with ambiguity
  • Maintains a collaborative and analytical approach
  • Self-confident and humble
  • Open to continuous learning
  • Intelligent, rigorous thinker who can operate successfully amongst bright people
  • Equally comfortable interacting with technologists and with business executives.
Company Description

HCT INFOTECH is a global IT consulting powerhouse headquartered in Florida, USA, with strategic offices in Asia and Africa. We specialize in integrated, reliable, and cost-effective solutions, delivering unparalleled value to numerous enterprises. Our areas of expertise include Cloud Services & Solutions, Software Development, Disaster Recovery Solutions, Backup Solutions, Cyber-Security, Managed Services, Networking, and ISO 27001 Compliance Certification. With a team of seasoned professionals boasting over 100 years of combined expertise, HCT INFOTECH is dedicated to propelling businesses forward through innovative solutions.

About the Role

Are you passionate about designing, building, and optimizing scalable data pipelines? Do you thrive in a collaborative environment where innovation and efficiency are valued? We are seeking a skilled GCP Data Engineer to join our team in Pune for a 6-month duration. As a Data Engineer, you will be responsible for designing, developing, and maintaining data pipelines and workflows on Google Cloud Platform (GCP), utilizing technologies such as Apache Airflow and BigQuery. The key responsibilities and qualifications are listed above.
How to Apply

If you are ready to take on this opportunity and meet the qualifications outlined above, please submit your resume to hiring@hctinfotech.com.

Job Type: Full-time
Pay: ₹75,000.00 - ₹90,000.00 per month
Schedule:
  • Day shift
Work Location: In person




Job Detail

  • Job Id
    JD3282817
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type:
    Full Time
  • Salary:
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    Pune, Maharashtra, India
  • Education
    Not mentioned
  • Experience
    4-6 Years