We are looking for a candidate to design, build, and maintain data pipelines and data models that support analytical and business intelligence needs. The ideal candidate will have hands-on experience with Python or SQL and Google Cloud Platform (GCP), along with a strong understanding of data management, quality, and security best practices.
Key Responsibilities
Build and maintain moderately complex data pipelines, ensuring data flow, transformation, and usability for analytical projects.
Design and implement data models, optimizing for performance and scalability.
Apply knowledge of data characteristics and supply patterns to develop rules and tracking processes that support data quality models.
Prepare data for analytical use by gathering, integrating, cleansing, and structuring data from multiple sources and systems.
Design, create, and interpret large and highly complex datasets.
Troubleshoot pipeline and data issues to ensure accuracy and reliability.
Stay up to date with GCP advancements and recommend innovative solutions.
Implement security best practices within data pipelines and cloud infrastructure.
Collaborate with global teams to share and adopt best practices in data management, maintenance, reporting, and security.
Develop and execute data quality checks to ensure consistency and integrity.
Work with credit data products and perform analysis using tools like Google BigQuery, BigTable, DataFlow, and Spark/PySpark.
Mandatory Skills
Python or SQL Proficiency: Experience with Python or SQL and intermediate scripting for data manipulation and processing.
GCP & Cloud Fundamentals: Intermediate understanding of and experience with Google Cloud Platform (GCP) and overall cloud computing concepts.
Data Pipeline Construction: Proven ability to build, maintain, and troubleshoot moderately complex pipelines.
Data Modeling & Optimization: Experience designing and optimizing data models for performance.
Data Quality Governance: Ability to develop rules, tracking processes, and checks to support a data quality model.
Data Preparation & Structuring: Skilled in integrating, consolidating, cleansing, and structuring data for analytical use.
Security Implementation: Knowledge of security best practices in pipelines and cloud infrastructure.
Big Data Analysis Tools: Hands-on experience with Google BigQuery, BigTable, DataFlow, and Scala + Spark or PySpark.
Advanced Data Formats: Experience working with JSON, Avro, and Parquet formats.
Communication & Best Practices: Strong communication skills to promote global best practices and guide adoption.
Preferred Qualifications
Cloud certification (e.g., GCP Data Engineer, AWS, Azure).
Experience with
About UST
UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world's best companies to make a real impact through transformation. Powered by technology, inspired by people, and led by purpose, UST partners with its clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into its clients' organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact, touching billions of lives in the process.