Manager

Gurgaon, Haryana, India

Job Description


Job Location: Bangalore/Gurgaon
Shift Timing: 12:30 PM IST - 10:30 PM IST
Experience: 5+ years

EXL Company Overview:

EXL (NASDAQ: EXLS) is a leading operations management and analytics company that designs and enables agile, customer-centric operating models to help clients improve their revenue growth and profitability. Our delivery model provides market-leading business outcomes using EXL's proprietary Business EXLerator Framework, cutting-edge analytics, digital transformation and domain expertise. At EXL, we look deeper to help companies improve global operations, enhance data-driven insights, increase customer satisfaction, and manage risk and compliance. EXL serves the insurance, healthcare, banking and financial services, utilities, travel, transportation and logistics industries. Headquartered in New York, New York, EXL has more than 32,000 professionals in locations throughout the United States, Europe, Asia (primarily India and the Philippines), South America, Australia and South Africa.

EXL Analytics provides data-driven, action-oriented solutions to business problems through statistical data mining, cutting-edge analytics techniques and a consultative approach. Leveraging proprietary methodology and best-of-breed technology, EXL Analytics takes an industry-specific approach to transforming our clients' decision making and embedding analytics more deeply into their business processes. Our global footprint of nearly 2,000 data scientists and analysts assists client organizations with complex risk minimization methods, advanced marketing, pricing and CRM strategies, internal cost analysis, and cost and resource optimization. EXL Analytics serves the insurance, healthcare, banking, capital markets, utilities, retail and e-commerce, travel, transportation and logistics industries. Please visit for more information about EXL Analytics.

Role Overview:

EXL provides consulting and analytics support to Fortune 500 companies across multiple industry domains. In this role, you will support the data engineering team of a leading US retail firm, working on GCP to develop data pipeline and/or data warehousing solutions.
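For a concrete, purely illustrative picture of that kind of work, here is a minimal sketch of a Cloud Composer / Airflow DAG that loads daily CSV drops from Cloud Storage into BigQuery. It assumes Airflow 2.x with the Google provider package installed; the DAG id, project, bucket and table names are hypothetical and not part of this posting.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
        GCSToBigQueryOperator,
    )

    # All names below are hypothetical, for illustration only.
    with DAG(
        dag_id="daily_orders_load",
        schedule_interval="@daily",
        start_date=datetime(2024, 1, 1),
        catchup=False,
    ) as dag:
        # Load one day's CSV drop from a GCS landing bucket into a BigQuery table.
        load_orders = GCSToBigQueryOperator(
            task_id="gcs_to_bigquery_orders",
            bucket="example-retail-landing",
            source_objects=["orders/{{ ds }}/*.csv"],
            destination_project_dataset_table="example-project.retail.orders",
            source_format="CSV",
            skip_leading_rows=1,
            write_disposition="WRITE_APPEND",
        )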
Some of your responsibilities include:

• Be an integral part of large-scale client business development and delivery engagements
• Work across all phases of the SDLC, using software engineering principles to build scaled solutions
• Build the knowledge base required to deliver increasingly complex technology projects
• Apply team handling, problem solving, project management, communication and creative-thinking skills
• Build and maintain data pipelines (batch/streaming), complex data transformations, and analytical components (a streaming sketch follows the lists below)
• Develop and maintain data warehouses / data lakes
• Build analytics software products that use the data pipeline to provide actionable insights
• Assemble large, complex data sets that meet functional and non-functional business requirements
• Understand and translate business needs into data models supporting long-term solutions
• Analyse data-related system integration challenges and propose appropriate solutions
• Perform data validation processes
• Work with other team members to accomplish key software development tasks
• Work with the operations support team on transition and stabilization

Eligibility:

• Master's or Bachelor's degree in Math, Statistics, Economics, Computer Science or a related analytics field from a top-tier university, with a strong record of achievement
• 5+ years of overall experience, including 3-4 years with GCP: BigQuery, Cloud Composer / Airflow, Dataflow / Apache Beam, Dataproc, Cloud Storage, Pub/Sub, Data Fusion, Cloud Functions, GCP Data Transfer, gcloud CLI, and the Google Cloud Python SDK
• Strong knowledge of database concepts and data modelling: RDBMS vs NoSQL, OLTP vs OLAP, MPP architecture
• Experience with relational SQL and columnar databases
• Experience with ETL tools such as Data Fusion, Fivetran, Talend, Matillion, etc.
• Excellent command of SQL, Python and PySpark
• Exposure to other streaming/messaging applications such as Kafka, MQ, etc.
• Strong analytical skills working with large stores of databases and tables
• Knowledge of data modelling, database design, and the data warehousing ecosystem
• Very good knowledge of data quality management and the ability to perform data validation
• Good exposure to CI/CD methodologies and tools such as Cloud Build/Run/Source, Jenkins, Git, etc.
• Strong verbal and business communication skills
• Strong business acumen and a demonstrated aptitude for analytics that incites action
• Effective time management and attention to detail

Good to Have:

• Good exposure to, and hands-on knowledge of, on-premise data warehouse / data lake solutions
• Experience with NoSQL and columnar databases
• Migration/conversion of on-prem data warehouse, ETL and Spark workloads to GCP
• Proficiency in Linux/Unix shell scripting
• Experience with monitoring and alerting using GCP services
• GCP Professional Data Engineer certification preferred
• Some experience with AWS services such as Redshift and Glue
• Experience with Kubernetes
• Experience with Docker
• Working experience with reporting tools
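As a second illustrative sketch touching the streaming side of the stack listed above (Pub/Sub, Dataflow / Apache Beam, BigQuery): a minimal unbounded pipeline that reads JSON events from a Pub/Sub subscription and appends them to a BigQuery table. Again, the project, subscription, table and field names are hypothetical, not taken from the posting.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def parse_event(message: bytes) -> dict:
        """Decode one Pub/Sub message into a BigQuery-ready row."""
        return json.loads(message.decode("utf-8"))


    def run() -> None:
        # streaming=True makes this an unbounded (streaming) pipeline.
        options = PipelineOptions(streaming=True)
        with beam.Pipeline(options=options) as p:
            (
                p
                | "ReadEvents" >> beam.io.ReadFromPubSub(
                    subscription="projects/example-project/subscriptions/orders-sub"
                )
                | "ParseJSON" >> beam.Map(parse_event)
                | "WriteToBQ" >> beam.io.WriteToBigQuery(
                    "example-project:retail.orders_events",
                    schema="order_id:STRING,amount:FLOAT,event_ts:TIMESTAMP",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                )
            )


    if __name__ == "__main__":
        run()

Run locally with the default DirectRunner for testing, or pass --runner=DataflowRunner (plus a project and region) to execute the same code on Dataflow.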




Job Detail

  • Job Id: JD3119532
  • Industry: Not mentioned
  • Total Positions: 1
  • Job Type: Full Time
  • Salary: Not mentioned
  • Employment Status: Permanent
  • Job Location: Gurgaon, Haryana, India
  • Education: Not mentioned
  • Experience: 5+ years