Sound working experience with the AWS cloud tech stack and Snowflake
Coordinate closely with the Business teams for data integration and alignment
Design, develop, maintain, support, and enhance the business intelligence data backend, including data warehouses and data lakes
Ability to assess the current environment and requirements and propose data engineering solutions
Adapt quickly to changing requirements and be willing to work with different technologies as needed
Ability to build frameworks and pipelines using PySpark and AWS Glue
Able to enhance the existing extraction, ingestion, and enrichment frameworks built with PySpark and Glue
Able to build logic that reads data from Amazon S3 and applies transformations using Python
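As an illustration of the S3-read-and-transform work described above, a minimal sketch in plain Python. The field names and the commented-out S3 bucket/key are hypothetical, and a real Glue job would typically operate on PySpark DataFrames rather than raw CSV text:

```python
import csv
import io

def enrich_orders(raw_csv):
    """Parse a raw CSV extract (e.g. downloaded from S3) and apply
    simple enrichment: compute a line total and normalise the status."""
    rows = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        row["line_total"] = float(row["unit_price"]) * int(row["quantity"])
        row["status"] = row["status"].strip().upper()
        rows.append(row)
    return rows

# In an actual Glue or Lambda job, the raw text would come from S3, e.g.:
#   import boto3
#   raw_csv = boto3.client("s3").get_object(
#       Bucket="example-bucket", Key="orders/2024/01/orders.csv"
#   )["Body"].read().decode("utf-8")
```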
Able to understand the existing AWS Lambda, SNS, and SQS event flows
Able to work with the platform team on problem-solving and troubleshooting activities
Propose standards and best practices leveraging AWS services
Apply creative thinking to arrive at technical solutions that advance business goals and align with corporate technology strategies
Help application teams onboard feeds into the existing frameworks
Follow best design practices when implementing ETL frameworks
Develop prototypes, technical proofs of concept, and code for various solutions, and convert platform architecture blueprints into working artifacts
Qualifications:
Bachelor's or Master's Degree in Computer Science or equivalent experience
At least 8 years of experience in application design, development and analysis
Experience with AWS cloud solutions; retail industry experience preferred
Implement data transformations and data structures for data warehouse and lake/repository.
Manage cloud and/or on-premises solutions for data transfer and storage.
Process data using Spark (PySpark)
Collaborate and work with data analysts in various functions to ensure that data meets their reporting and analysis needs.
Experience creating ETL frameworks that process and extract data from cloud databases using AWS services such as Lambda, Glue, PySpark, Step Functions, SNS, SQS, and Batch
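The kind of orchestration listed above (Glue jobs coordinated by Step Functions with an SNS notification) can be sketched as an Amazon States Language definition; it is shown here as a Python dict purely for illustration, and the job name and topic ARN are hypothetical:

```python
# A hypothetical Step Functions state machine (Amazon States Language),
# expressed as a Python dict: run a Glue ETL job, then publish a
# completion notice to SNS. Job and topic names are invented.
etl_state_machine = {
    "Comment": "Extract/transform with Glue, then notify via SNS",
    "StartAt": "RunGlueJob",
    "States": {
        "RunGlueJob": {
            "Type": "Task",
            # .sync makes Step Functions wait for the Glue job to finish
            "Resource": "arn:aws:states:::glue:startJobRun.sync",
            "Parameters": {"JobName": "example-enrichment-job"},
            "Next": "NotifySuccess",
        },
        "NotifySuccess": {
            "Type": "Task",
            "Resource": "arn:aws:states:::sns:publish",
            "Parameters": {
                "TopicArn": "arn:aws:sns:us-east-1:123456789012:example-topic",
                "Message": "ETL run complete",
            },
            "End": True,
        },
    },
}
```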
Proven strategic thinker, able to drive the necessary ownership and data governance within the organization
Ability to evaluate risks and provide business recommendations and solutions in a timely manner
Ability to remain calm in high-stress situations while navigating and leading through conflict
Sound knowledge of the various AWS services
Able to understand the existing frameworks built with PySpark and AWS Glue
Able to review ETL frameworks and propose optimizations and cost savings
Able to work independently and make key decisions
Comfortable interacting with stakeholders at all levels across Business and IT
Ability to express complex concepts effectively, both verbally and in writing, to IT and business partners
Understanding of and experience working in an Agile environment
Knowledge of Talend Cloud ETL, Kafka, Snowflake Cloud, and Power BI is a plus
Independent. Strong critical thinking, decision-making, troubleshooting, and problem-solving skills.
Go-getter. Strong planning, execution, and multitasking skills, with a demonstrated ability to reprioritize. Must be able to manage quickly changing priorities while meeting deadlines.
Job Type: Full-time
Pay: ₹1.00 - ₹3,000,000.00 per year
Work Location: In person