Data Engineer II

Gurgaon, Haryana, India

Job Description



At Netomi AI, we are on a mission to create artificial intelligence that builds customer love for the world's largest global brands.


Some of the largest brands are already using Netomi AI's platform to solve mission-critical problems, giving you the opportunity to work with top-tier clients at a senior level and build your network.



Backed by the world's leading investors such as Y-Combinator, Index Ventures, Jeffrey Katzenberg (co-founder of DreamWorks) and Greg Brockman (co-founder & President of OpenAI/ChatGPT), you will become a part of an elite group of visionaries who are defining the future of AI for customer experience. We are building a dynamic, fast-growing team that values innovation, creativity, and hard work. You will have the chance to significantly impact the company's success while developing your skills and career in AI.


Want to become a key part of the Generative AI revolution? We should talk.



Netomi is seeking a highly analytical and detail-oriented candidate to join the Analytics team in Gurugram. As part of the team, you will work with product, engineering, and customer success teams to drive complex data and trend analyses, propose improvements, and thereby help improve the customer experience. You will also be responsible for benchmarking and measuring the performance of various product operations projects, building and publishing detailed scorecards and reports, and identifying and driving new opportunities based on customer and business data.


We are looking for a Data Engineer with a passion for using data to discover and solve real-world problems. You will enjoy working with rich data sets and modern business intelligence technology, and seeing your insights drive features for our customers. You will also have the opportunity to contribute to the development of policies, processes, and tools to address product quality challenges in collaboration with other teams.

Responsibilities

  • You will partner with teammates to create complex data processing pipelines in order to solve our clients' most ambitious challenges
  • You will pair-program to write clean, iterative code following TDD practices
  • Leverage various continuous delivery practices to deploy, support and operate data pipelines
  • Advise and educate clients on how to use different distributed storage and computing technologies from the plethora of options available
  • Develop and operate modern data architecture approaches to meet key business objectives and provide end-to-end data solutions
  • Create data models and speak to the tradeoffs of different modeling approaches
  • Seamlessly incorporate data quality into your day-to-day work as well as into the delivery process

Requirements

  • Minimum 4 years of work experience with a start-up mentality and high willingness to learn.
  • Solid knowledge of Java or Python and a good understanding of the tech stack of any web framework (Spring, Django, etc.) to write maintainable, scalable, unit-tested code.
  • Expertise in SQL and a strong understanding of relational databases (e.g. MySQL, PostgreSQL) and NoSQL databases.
  • You have a good understanding of data modeling.
  • Work experience as an ETL developer would be preferable.
  • Experience with data engineering tools and platforms such as Kafka, Druid, AWS Kinesis, Spark and Hadoop is desirable.
  • You have built large-scale data pipelines and data-centric applications in a production setting, using any of the distributed storage platforms such as HDFS, S3, or NoSQL databases (HBase, Cassandra, etc.) and any of the distributed processing platforms like Hadoop, Spark, Hive, and Airflow.
  • You are comfortable taking data-driven approaches and applying data security strategy to solve business problems.
  • You're genuinely excited about data infrastructure and operations, and are familiar with working in cloud environments.
  • Working with data excites you: you can build and operate data pipelines, and maintain data storage, all within distributed systems.
  • Ability to ensure effective collaboration between Netomi's and the business client's teams, encouraging open communication and advocating for shared outcomes.
  • Experience writing data quality unit and functional tests.

Netomi is an equal opportunity employer committed to diversity in the workplace. We evaluate qualified applicants without regard to race, color, religion, sex, sexual orientation, disability, veteran status, and other protected characteristics.




Job Detail

  • Job Id
    JD3188865
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type
    Full Time
  • Salary
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    Gurgaon, Haryana, India
  • Education
    Not mentioned
  • Experience
    4+ years