Does (The tasks / responsibilities that the role performs to address
requirements in Key Result Areas).
Design and develop data pipelines that ingest data from the respective source systems into the data warehouse
Support and optimize existing data pipelines to ensure that the
downstream SLAs are met
Attend agile ceremonies as part of the Agile Tribe set-up and deliver the Tribe's demands related to data ingestion into the data warehouse platform
Ensure data quality within the data warehouse tables
Ensure that data engineering standards and practices are followed, such as code version history, readability, functionality, extensibility, test runs, deployments, and post-deployment support
Support the L2 Data Operations team and act as L3 support
Manage data storage, compute resource requirements, and data security within the data warehouse
Gather data requirements arising from Tribe demands
Support data engineers in other Tribes or domains when necessary
Demonstrate technical excellence on the data warehouse platform
Support stakeholders of the data warehouse platform by resolving data-related issues and conducting data investigations
Delivers (The specific outputs / tangible results produced by the role and the resources it is responsible for)
Data Delivery Documentation - Data Dictionary, Data Mapping Documents, Data Models
Optimized Data Pipelines
Project delivery and operations support
Skills
Graduate of any IT-related course such as BS Computer Science, Information Technology, or similar
Strong SQL and ETL development experience, preferably related to
Data Warehousing, Business Intelligence, or other Data Delivery
projects.
Proficiency in optimizing data pipelines to meet SLA requirements and reduce costs
Experience in Big Data, Hadoop, EMR, Hive, and cloud environments
is a plus
Proficiency in Python, R, Java, or Scala is a plus
Proficiency in data modeling, data integrations, data quality, and data
architecture is a plus
AWS data integrations (Glue, Athena)
Spark, Databricks
Delta lake, Data Lake
Kafka, Confluent, Real-time processing, Data Streaming
4-6 years of experience in IT project delivery or data development, particularly in Data Warehousing or Business Intelligence
About RCG Global Services
At Myridius, we transform the way businesses operate. Formerly known as RCG Global Services, our more than 50 years of expertise now drive a new vision--propelling organizations through the rapidly evolving landscapes of technology and business. We offer tailored solutions in AI, data analytics, digital engineering, and cloud innovation, addressing the unique challenges each industry faces. Our integration of cutting-edge technology with deep domain knowledge enables businesses to seize new opportunities, drive significant growth, and maintain a competitive edge in the global market. Our commitment is not just to meet expectations but to exceed them, ensuring measurable impact and fostering sustainable innovation. The success of Myridius is directly tied to the breakthroughs achieved by our clients. Together, we co-create solutions that not only solve today's challenges but also anticipate future trends. At Myridius, we go beyond typical service delivery. We craft transformative outcomes that help businesses not just adapt, but thrive in a world of continuous change. Discover how Myridius can elevate your business to new heights of innovation. Visit us at www.myridius.com and start leading the change.