Engineer II

Bangalore, Karnataka, India

Job Description


About Neiman Marcus GTS

The Neiman Marcus Group is a luxury retailer and a relationship business that leads with love in everything we do for our customers, associates, brand partners, and communities. As we continue investing in new technologies, our Neiman Marcus Global Technology Services (NM GTS) center in Bangalore, India, is driving rapid digital transformation efforts to deliver the best integrated customer experience across our stores and our online and remote digital selling points. If you are curious and passionate about the technology that is transforming and shaping tomorrow's luxury retail, are interested in a role change, and would enjoy the work culture of a 100-year-old luxury retailer, then you will find a home with us. At NM GTS you will know your purpose, aim for mastery, and have the autonomy to take ownership of your career. For more information, visit neimanmarcusgroup.com.



Neiman Marcus Group has an immediate opening for a Data Engineer. The Data Engineer will have a unique combination of the business acumen needed to interface directly with key stakeholders to understand the problem and the skills and vision to translate that need into a world-class technical solution using the latest technologies.

This is a hands-on role responsible for building data engineering solutions for the NMG enterprise using a cloud-based data platform. The Data Engineer will deliver day-to-day technical work and participate in the technical design, development, and support of data engineering workloads. In this role, you need to be equally skilled with the whiteboard and the keyboard.

Essential Duties & Responsibilities:

  • Understand and analyze data from multiple data sources and develop technology to integrate the enterprise data layer
  • Create robust, automated pipelines to ingest and process structured and unstructured data from source systems into analytical platforms, using batch and streaming mechanisms that leverage the cloud-native toolset
  • Process complex data sets, leverage the technologies used to work with these disparate data sets, and understand the correlations and patterns that exist between them
  • Implement orchestration of data pipelines and environments using Airflow (see the illustrative sketch after this list)
  • Implement custom applications using Kinesis, Lambda, and other AWS services as required to address streaming use cases
  • Implement automation to optimize data platform compute and storage resources
  • Develop and enhance end-to-end monitoring capabilities for cloud data platforms
  • Participate in educating and cross-training other team members
  • Provide regular updates to all relevant stakeholders
  • Participate in daily scrum calls and provide clear visibility into work products
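
To make the Airflow expectation concrete, here is a minimal sketch of the kind of batch orchestration DAG this role would build. The DAG id, task names, schedule, and ingest/load functions are hypothetical placeholders for illustration, not an actual NMG pipeline.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_source_data():
    # Placeholder: pull structured/unstructured data from a source system.
    print("ingesting source data")


def load_to_platform():
    # Placeholder: land the processed data in the analytical platform.
    print("loading to analytical platform")


with DAG(
    dag_id="enterprise_data_ingest",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # batch cadence; streaming is handled elsewhere
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    ingest = PythonOperator(task_id="ingest_source_data", python_callable=ingest_source_data)
    load = PythonOperator(task_id="load_to_platform", python_callable=load_to_platform)

    ingest >> load  # linear dependency: ingest before load
```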
Qualifications/Competencies:
  • Bachelor's degree with emphasis on Computer Science, Engineering, Information Systems, Mathematics, Statistics, or a related discipline
  • 6+ years of experience in the data engineering and analytics space
  • 5+ years of Python experience with solid, expert-level (4/5) programming skills, including strong experience with lambdas and Airflow DAG processing
  • 8+ years of experience with RDBMS concepts, with strong data analysis and SQL skills
  • 3+ years of proficiency with Linux command-line tools and bash scripting
  • 1+ years of experience working with big data processing frameworks and tools
  • Exposure to software engineering concepts such as parallel data processing, data flows, and REST
  • 2+ years of experience with real-time data capture, processing, and storage using technologies like Kafka and AWS Kinesis (see the Lambda sketch after this section)
  • 2+ years of experience with AWS technologies and services
  • Experience with APIs, JSON, XML, and microservice architectures
  • Certification preferred, such as AWS Certified Big Data or a comparable certification on another cloud or big data platform
Nice to have:
  • Cloud data warehouse experience; Snowflake is a plus
  • Data modeling experience is a plus
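
As an illustration of the real-time data capture called out above, the sketch below shows an AWS Lambda handler consuming records from a Kinesis stream. The record handling follows the standard Kinesis event shape; the JSON payload format and the downstream routing step are assumptions made only for this example.

```python
import base64
import json


def handler(event, context):
    """Entry point for a Lambda function triggered by a Kinesis stream."""
    processed = 0
    for record in event["Records"]:
        # Kinesis delivers each record's data base64-encoded.
        payload = base64.b64decode(record["kinesis"]["data"])
        message = json.loads(payload)  # assumes producers write JSON (assumption)
        # Hypothetical downstream step: enrich, route, or persist the message.
        print(f"partition_key={record['kinesis']['partitionKey']} message={message}")
        processed += 1
    return {"recordsProcessed": processed}
```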

Neiman Marcus GTS


Job Detail

  • Job Id: JD3010420
  • Total Positions: 1
  • Job Type: Full Time
  • Employment Status: Permanent
  • Job Location: Bangalore, Karnataka, India