Senior Engineer II

Bangalore, Karnataka, India

Job Description

Job Title:

Senior Engineer II

Keywords:

Required Skills: API, Coding, Cost Reduction, Data Analytics, Data Conversion

Additional Skills: Data Migration, Data Sources, Data Warehouse, Databases, Engineer, ETL, Facets, Frameworks, Hadoop, Integration, J2EE, Java, Kafka, Life Cycle, MongoDB, MPP, MySQL, NoSQL, Process Improvements, PySpark, Python, Software Development, Software Development Life Cycle, SQL, SQL Server, System Integration, Visual Studio, .NET, Apache Kafka, Architecture, Blueprints, Business Requirements, Data Center

Number of Positions:

1

Duties:

As a Senior Engineer II, you will be responsible for creating innovative interoperability platforms, tools, and solutions that enable seamless and secure data integration. Your solutions will connect legacy, newly developed, and vendor applications across data center and cloud environments, and you will own the full lifecycle of those solutions:

  • Gathering system requirements
  • Developing specifications
  • Designing infrastructure and interfaces
  • Developing code
  • Implementing the platform
  • Helping our customers use it
  • Continuously improving the platform
  • Create and maintain optimal data pipeline architecture, and assemble large, complex data sets that meet business requirements (hands-on coding is roughly 80% of the role)
  • Identify, design, and implement internal process improvements: automate manual processes, optimize data delivery, and propose infrastructure redesigns where appropriate to achieve scalability and cost reduction
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS 'big data' technologies (see the sketch after this list)
  • Build and/or maintain analytics tools that utilize the data pipeline to provide actionable insights into operational efficiency and other key business performance metrics
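
To make the pipeline-building duty above concrete, here is a minimal PySpark sketch of an extract-transform-load flow of the kind described. It assumes a PySpark runtime; the bucket paths, app name, and column names are hypothetical placeholders, not details from this posting.

    # A minimal ETL sketch, assuming a PySpark runtime; all paths and
    # column names below are hypothetical placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("example-etl").getOrCreate()

    # Extract: read raw records from a source location.
    raw = spark.read.parquet("s3://example-bucket/raw/events/")

    # Transform: deduplicate and derive a reporting-friendly column.
    cleaned = (
        raw.dropDuplicates(["event_id"])
           .withColumn("event_month", F.date_trunc("month", F.col("event_ts")))
    )

    # Load: write the curated set, partitioned for downstream analytics.
    (cleaned.write.mode("overwrite")
            .partitionBy("event_month")
            .parquet("s3://example-bucket/curated/events/"))
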
Skills:
  • Experience ranges below are given as minimum to maximum (+) years:
  • 2 - 8+ years of IT experience across engineering tech stacks, frameworks, and programming languages
  • 2 - 5+ years in data engineering, ETL/ELT, data analytics, and reporting
  • 3 - 5+ years of experience programming in a backend language or framework (Java, J2EE, Kafka, Spark, Python, PySpark, Glue, APIs, etc.)
  • Advanced working knowledge of SQL and NoSQL, with experience across a variety of databases (MySQL, Hadoop, MongoDB, etc.), along with a good grasp of ETL and ELT concepts, data modelling, and data warehousing
  • Experience in building pipelines to migrate data from on-prem to cloud data repositories (e.g., Snowflake, Redshift, Synapse, Databricks)
  • Experience in ingesting data into cloud repositories from sources including flat files, SQL Server, Kafka, CDC, and Web APIs
  • Experience with development work to support data migration, cloud applications and related work
  • Experience with Spark, or the Hadoop ecosystem and similar frameworks
  • Strong analytic skills related to working with unstructured datasets
  • Familiarity with various cloud technologies such as AWS (EMR, RDS, Redshift, etc.) and Azure
  • Experienced in designing and developing data pipelines using PySpark in any public cloud (e.g., AWS, GCP, Azure) or hybrid environment, using AWS Glue, Glue Studio, Blueprints, etc.
  • Proficient in building large-scale systems end to end using Java/Python/GoLang or other high-performance languages, with developer tools, CI/CD, DevOps, GitHub, Terraform, monitoring tools, and cloud migration solutions
  • Hands-on coding (80% of the role), code walkthroughs, etc.
  • Able to develop complex PySpark code using SparkSQL, DataFrames, joins, transposes, etc., to load data into an MPP data warehouse, e.g., Snowflake (see the sketch after this list)
  • Experience in all facets of the software development life cycle, such as analysis, design, development, data conversion, data security, system integration, and implementation
  • Experience working with modern IDEs (such as Visual Studio Code, IntelliJ)
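
As referenced in the PySpark bullet above, here is a minimal sketch of joining DataFrames via SparkSQL and loading the result into an MPP warehouse (Snowflake) through the Snowflake Spark connector. The connection options, paths, table names, and columns are illustrative placeholders, and the connector JARs are assumed to be on the Spark classpath.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("load-to-snowflake").getOrCreate()

    # Register two curated data sets as temp views so the join can be
    # expressed in SparkSQL; both paths are placeholders.
    spark.read.parquet("s3://example-bucket/curated/members/") \
         .createOrReplaceTempView("members")
    spark.read.parquet("s3://example-bucket/curated/claims/") \
         .createOrReplaceTempView("claims")

    joined = spark.sql("""
        SELECT m.member_id, m.region, c.claim_id, c.claim_amount
        FROM members m
        JOIN claims c ON c.member_id = m.member_id
    """)

    # Write into Snowflake via the Snowflake Spark connector; every
    # connection option below is a placeholder, not a real account.
    sf_options = {
        "sfURL": "example_account.snowflakecomputing.com",
        "sfUser": "ETL_USER",
        "sfPassword": "********",
        "sfDatabase": "ANALYTICS",
        "sfSchema": "PUBLIC",
        "sfWarehouse": "ETL_WH",
    }
    (joined.write.format("net.snowflake.spark.snowflake")
           .options(**sf_options)
           .option("dbtable", "MEMBER_CLAIMS")
           .mode("overwrite")
           .save())
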
Education:
  • Bachelor's or Master's degree from an accredited college/university in a business-related or technology-related field.
  • Relevant AWS certifications (e.g., Cloud Practitioner, Solutions Architect Associate)
  • Migration and data integration strategies/certifications
  • Experience with any MPP data warehouse (e.g., Snowflake, BigQuery, Redshift)


Job Detail

  • Job Id: JD2946590
  • Industry: Not mentioned
  • Total Positions: 1
  • Job Type: Full Time
  • Salary: Not mentioned
  • Employment Status: Permanent
  • Job Location: Bangalore, Karnataka, India
  • Education: Not mentioned
  • Experience: Not mentioned