Overview:
We are looking for a skilled GCP Data Engineer with 3 to 10 years of hands-on
experience in data ingestion, data engineering, data quality, data governance, and cloud
data warehouse implementations using GCP data services. The ideal candidate will be
responsible for designing and developing data pipelines, participating in architectural
discussions, and implementing data solutions in a cloud environment.
Key Responsibilities:
• Collaborate with stakeholders to gather requirements and create high-level and detailed technical designs.
• Develop and maintain data ingestion frameworks and pipelines from various data sources using GCP services.
• Participate in architectural discussions, conduct system analysis, and suggest optimal solutions that are scalable, future-proof, and aligned with business requirements.
• Design data models suitable for both transactional and big data environments, supporting Machine Learning workflows.
• Build and optimize ETL/ELT infrastructure using a variety of data sources and GCP services.
• Develop and implement data and semantic interoperability specifications.
• Work closely with business teams to define and scope requirements.
• Analyze existing systems to identify appropriate data sources and drive continuous improvement.
• Implement and continuously enhance automation processes for data ingestion and data transformation.
• Support DevOps automation efforts to ensure smooth integration and deployment of data pipelines.
• Provide design expertise in Master Data Management (MDM), Data Quality, and Metadata Management.
Skills and Qualifications:
• 3 to 10 years of overall hands-on experience as a Data Engineer, with at least 2-3 years of direct GCP Data Engineering experience.
• Strong SQL and Python development skills are mandatory.
• Solid experience in data engineering, working with distributed architectures, ETL/ELT, and big data technologies.
• Demonstrated knowledge and experience with Google Cloud BigQuery is a must.
• Experience with Dataproc and Dataflow is highly preferred.
• Strong understanding of serverless data warehousing on GCP and familiarity with DWBI modeling frameworks.
• Extensive experience in SQL across various database platforms.
• Experience with any BI tools is also preferred.
• Experience in data mapping and data modeling.
• Familiarity with data analytics tools and best practices.
• Hands-on experience with one or more programming/scripting languages such as Python, JavaScript, Java, R, or UNIX Shell.
• Practical experience with Google Cloud services including but not limited to:
o BigQuery, BigTable
o Cloud Dataflow, Cloud Dataproc
o Cloud Storage, Pub/Sub
o Cloud Functions, Cloud Composer
o Cloud Spanner, Cloud SQL
• Knowledge of modern data mining, cloud computing, and data management tools (such as Hadoop, HDFS, and Spark).
• Familiarity with GCP tools like Looker, Airflow DAGs, Data Studio, App Maker, etc.
• Hands-on experience implementing enterprise-wide cloud data lake and data warehouse solutions on GCP.
• GCP Data Engineer Certification is highly preferred.
Job Type: Full-time
Pay: ₹500,298.14 - ₹1,850,039.92 per year
Benefits:
Health insurance
Schedule:
Rotational shift
Work Location: In person