Senior Data Integration Engineer

Bangalore, Karnataka, India

Job Description


Your Job

You're raising the stakes for your career to do more. Learn more. Impact more. Here, your innovation, ideas, and technical curiosity will help us deliver better care for billions of people worldwide. You'll put your professional expertise, talent, and drive to work by building and managing the technology behind our portfolio of iconic brands, which in turn helps billions of people around the world. It starts with YOU.

About Us

Huggies. Kleenex. Cottonelle. Scott. Kotex. Poise. Depend. K-C Professional. You already know our legendary brands, and so does the rest of the world. In fact, 25% of people in the world use Kimberly-Clark products every day, and it takes the absolute best people to make that happen. We're founded on 150 years of market leadership, and we're always looking for new and better ways to perform, especially when it comes to product and process innovation. Our customers are always looking for new and better. Our competitors won't stop evolving. And our communities demand responsible corporate practices. We need bold, transformative ideas from people who can turn them into reality. That means there's no time like the present to make an impact here. You just need to log on! Led by Purpose, Driven by You.

About You

You were made to do this work: designing new technologies, diving into data, optimizing digital experiences, and constantly developing better, faster ways to get results. You want to be part of a performance culture dedicated to building technology for a purpose that matters. You want to work in an environment that promotes sustainability, inclusion, wellbeing, and career development.
Responsibilities:

  • Work with technical architects, product owners, and business teams to translate requirements into technical designs for data modelling and data integration
  • Demonstrate a deep background in data warehousing, data modelling, and ETL/ELT data processing patterns
  • Design and develop ETL/ELT pipelines with reusable patterns and frameworks
  • Design and build efficient SQL to process and curate data sets in HANA, Azure, and Snowflake
  • Design and review data ingestion frameworks leveraging Python, Spark, Azure Data Factory, Snowpipe, etc.
  • Design and build Data Quality models and ABCR frameworks to ingest, validate, curate, and prepare data for consumption
  • Understand the functional domain and business needs, and proactively identify gaps in requirements prior to implementing solutions
  • Work with platform teams to design and build processes for automating pipeline builds, testing, and code migrations
  • Demonstrate exceptional impact in delivering projects, products, and/or platforms in terms of scalable data processing and application architectures, technical deliverables, and delivery throughout the project lifecycle
  • Provide design and guiding principles for building data models and semantic models in Snowflake, enabling true self-service
  • Ensure the effectiveness of the ingestion and data delivery frameworks and patterns
  • Build and maintain data development standards and principles; provide guidance and project-specific recommendations as appropriate
  • Be conversant with the DevOps delivery approach and tools, with a track record of delivering products in an agile model
  • Provide insight and direction on the roles and responsibilities required for platform/product operations

Qualifications:

  • 7+ years of experience designing, developing, and building ETL/ELT pipelines, procedures, and SQL on MPP platforms such as HANA and Snowflake
  • 7+ years of experience in data warehousing and business intelligence
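To give candidates a flavour of the "reusable patterns" and ABCR (audit/balance/control/reconcile) responsibilities above, here is a minimal Python sketch of one ELT step with a row-count reconciliation check. All names (`run_step`, `StepResult`, the record layout) are hypothetical illustrations, not part of any Kimberly-Clark framework:

```python
# Sketch of a reusable ELT step with an ABCR-style balance check:
# every extracted row must be accounted for in the target, or the
# step fails loudly. Names and layout are illustrative assumptions.
from dataclasses import dataclass
from typing import Any, Callable, Iterable

@dataclass
class StepResult:
    rows_read: int
    rows_written: int

    @property
    def balanced(self) -> bool:
        # Balance/reconcile: extracted count must equal loaded count.
        return self.rows_read == self.rows_written

def run_step(extract: Callable[[], Iterable[Any]],
             transform: Callable[[Any], Any],
             load: Callable[[list], int]) -> StepResult:
    """Run one extract -> transform -> load step and audit row counts."""
    rows = list(extract())
    transformed = [transform(r) for r in rows]
    written = load(transformed)
    result = StepResult(rows_read=len(rows), rows_written=written)
    if not result.balanced:
        raise RuntimeError(f"reconciliation failed: {result}")
    return result

# Usage: an in-memory list stands in for a warehouse table.
target: list = []
res = run_step(
    extract=lambda: [{"id": 1, "qty": "3"}, {"id": 2, "qty": "5"}],
    transform=lambda r: {**r, "qty": int(r["qty"])},   # curate types
    load=lambda rows: (target.extend(rows), len(rows))[1],
)
print(res.balanced)  # True when counts reconcile
```

In a real pipeline the extract/load callables would wrap ADF, Snowpipe, or Spark jobs, but the audit-and-balance shape is the same.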
  • Experience with data warehousing concepts such as star schema, snowflake schema, fact tables, and dimension tables
  • Experience using the various features of ADF to load data into Snowflake
  • Experience designing and building metadata-driven data ingestion frameworks with Azure Data Factory, SnowSQL, and Snowpipe, as well as building mini-batch, real-time, and event-driven data processing jobs
  • Hands-on experience with Azure
  • Familiarity with Azure Stream Analytics, Azure Analysis Services, Data Lake Analytics, HDInsight, HDP, Spark, Databricks, MapReduce, Pig, Hive, Tez, SSAS, Watson Analytics, and SPSS
  • Strong knowledge of source code management, configuration management, CI/CD, security, and performance
  • Ability to look ahead to identify opportunities and thrive in a culture of innovation
  • Self-starter who can see the big picture and prioritize work to make the largest impact on the business's and customer's vision and requirements
  • Experience building, testing, and deploying code to run on an Azure cloud data lake
  • Ability to lead, nurture, and mentor others on the team
  • A can-do attitude in anticipating and resolving problems to help your team achieve its goals
  • Experience with agile development methods

Preferred:

  • Knowledge of Change Data Capture on both source and target, and implementation of SCD (Slowly Changing Dimension) Types 1, 2, and 3
  • Experience modelling a data warehouse in Snowflake
  • Experience with Azure Logic Apps and Power Automate
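As context for the SCD requirement above, here is a minimal, illustrative Python sketch of a Slowly Changing Dimension Type 2 update: an attribute change closes the current row and appends a new "current" version, preserving history. The record fields (`key`, `city`, `current`, `start_date`, `end_date`) are assumptions for the example, not a prescribed schema:

```python
# Illustrative SCD Type 2 sketch on an in-memory dimension table.
# A change to a tracked attribute expires the current version and
# appends a new one; unchanged rows are left alone.
def scd2_apply(dim: list, key: int, city: str, as_of: str) -> None:
    """Apply one change to a Type 2 dimension, versioning history."""
    current = next((r for r in dim if r["key"] == key and r["current"]), None)
    if current and current["city"] == city:
        return  # no attribute change, nothing to version
    if current:
        current["current"] = False   # close out the old version
        current["end_date"] = as_of
    dim.append({"key": key, "city": city, "current": True,
                "start_date": as_of, "end_date": None})

dim: list = []
scd2_apply(dim, key=42, city="Bangalore", as_of="2023-01-01")
scd2_apply(dim, key=42, city="Mumbai", as_of="2024-06-01")
print(len(dim))                     # 2 versions retained
print([r["current"] for r in dim])  # [False, True]
```

In a warehouse this same logic is typically expressed as a `MERGE` against the dimension table; Type 1 would instead overwrite in place, and Type 3 would keep a "previous value" column.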
  • Proficiency in distributed computing principles, modular application architecture, and various data processing patterns: real-time, batch, lambda, and other architectures
  • Experience with a broad range of data stores: object stores (Azure ADLS, HDFS, GCP Cloud Storage); row and columnar databases (Azure SQL DW, SQL Server, Snowflake, Teradata, PostgreSQL, Oracle); NoSQL databases (Cosmos DB, MongoDB, Cassandra); Elasticsearch; Redis; and data processing platforms such as Spark, Databricks, and SnowSQL

You love what you do, especially when your work makes a difference. At Kimberly-Clark, we're constantly exploring new ideas on how, when, and where we can best achieve results. When you join our team, you'll experience Flex That Works: flexible (hybrid) work arrangements that empower you to have purposeful time in the office and partner with your leader to make flexibility work for both you and the business.

To Be Considered

Click the Apply button and complete the online application process. A member of our recruiting team will review your application and follow up if you seem like a great fit for this role. In the meantime, check out the . You'll want to review this and come prepared with relevant questions if and when you pass GO and begin interviews.





Job Detail

  • Job Id
    JD3043432
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type
    Full Time
  • Salary
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    Bangalore, Karnataka, India
  • Education
    Not mentioned
  • Experience
    7+ years