Lead Data Engineer

Bengaluru, Karnataka, India

Job Description



Your Job

The Lead Data Engineer will report to the Data Engineering & BI Lead of KGSI and will be responsible for developing and implementing a future-state data analytics platform, covering both the back-end data processing and the front-end data visualization components, for the Finance Data Delivery teams. This is a hands-on role: the candidate will be responsible for the design and development of data frameworks.

Our Team

The Lead Data Engineer will be part of the KGSI team that designs, develops, and delivers data engineering solutions leveraging the latest data engineering technologies for Koch Industries. Koch Industries is a privately held global organization with over 120,000 employees around the world, with subsidiaries involved in manufacturing, trading, and investments. Koch Technology Center (KTC) is being developed in India to extend its IT operations and to act as a hub for innovation in the IT function. As KGSI rapidly scales up its operations in India, its employees will get opportunities to carve out career paths for themselves within the organization. This role offers the opportunity to join on the ground floor and will play a critical part in helping build out KGSI (Koch Global Services India) over the next several years. Working closely with global colleagues will provide significant global exposure.

What You Will Do
  • Be part of the data team that designs and builds data engineering solutions.
  • Implement batch and near-real-time data movement design patterns and define best practices in data engineering.
  • Design and develop optimal cloud data solutions (lakes, warehouses, marts, analytics) by collaborating with diverse IT teams including business analysts, project managers, architects, and developers.
  • Work closely with a team of data architects, data engineers, BI developers, and data analysts to procure, blend, and analyze data for quality and distribution, ensuring key elements are harmonized and modeled for effective analytics while operating in a fluid, rapidly changing data environment.
  • Build data pipelines from a wide variety of sources.
  • Demonstrate strong conceptual, analytical, and problem-solving skills, and articulate ideas and technical solutions effectively to external IT partners as well as internal data team members.
  • Work with cross-functional teams, on-shore/off-shore, development/QA teams/Vendors in a matrixed environment for data delivery.
  • Apply strong troubleshooting skills.
  • Update and maintain key cloud data solution deliverables and diagrams.
  • Ensure conformance and compliance with KGSI architecture guidelines and the enterprise data strategic vision.
Who You Are (Basic Qualifications)
  • Bachelor's degree in Computer Science, Engineering, or a related IT area, with at least 10 years of experience in software development.
  • Primary skill set: Data Engineering; Python (especially strong in object-oriented programming concepts); AWS (Glue, Lambda, EventBridge, Step Functions, and serverless architecture); columnar databases (Redshift or Snowflake)
  • Secondary skill set: working with APIs, Spark, Git/CI-CD, SQL
  • At least 5 years of hands-on experience in designing, implementing, and managing large-scale ETL solutions.
  • 2+ years of experience in people management.
  • At least 3 years of hands-on experience in data modelling, data engineering, ETL, multi-dimensional data warehouses, and cubes, with expertise in relevant languages and frameworks such as SQL and Python.
  • Hands-on experience designing and fine-tuning queries in columnar databases (Redshift or Snowflake).
  • Understanding of common DevSecOps/DataOps and CI/CD processes, methodologies, and technologies such as GitLab and Terraform.
  • Ability to analyze large, complex data sets to resolve data quality issues.
What Will Put You Ahead
  • AWS certifications such as Solutions Architect (SAA/SAP) or Data Engineer Associate (DEA).
  • Hands-on experience with AWS data technologies, including at least one full-life-cycle project building a data solution in AWS.
  • Exposure to visualization tools such as Tableau or Power BI.
  • Exposure to OLAP technologies and data virtualization (using Denodo).
  • Knowledge of the manufacturing domain is preferable.
At Koch companies, we are entrepreneurs. This means we openly challenge the status quo, find new ways to create value, and get rewarded for our individual contributions. Any compensation range provided for a role is an estimate determined by available market data. The actual amount may be higher or lower than the range provided considering each candidate's knowledge, skills, abilities, and geographic location. If you have questions, please speak to your recruiter about the flexibility and detail of our compensation philosophy.

Who We Are

At Koch, employees are empowered to do what they do best to make life better. Learn how our business philosophy helps employees unleash their potential while creating value for themselves and the company.


Job Detail

  • Job Id
    JD3242210
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type
    Full Time
  • Salary
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    Bengaluru, Karnataka, India
  • Education
    Not mentioned
  • Experience
    Year