You will be a critical member of the InfoCepts Cloud Data Architect Team. This position requires sound knowledge of Big Data using cloud technologies such as Databricks, EMR, Athena, PySpark, S3, and AWS Lambda. A strong foundation in database concepts and SQL is also required.
Location:
Nagpur / Pune / Bangalore / Chennai
Type of Employment:
Full-time
Key Result Areas and Activities:
Technology Assessment and Design
Study the existing technology landscape, understand current data integration frameworks, and perform impact assessments for new requirements.
Design complex Big Data use cases using AWS services and Databricks, under the guidance of an Architect.
Ensure optimal balance between cost and performance.
Documentation and Stakeholder Communication
Prepare project documentation, adhering to quality guidelines and schedules.
Work hand in hand with the Architect and PM for successful project delivery, assisting with estimation, scoping, and scheduling.
Articulate design decisions clearly to stakeholders.
Perform proofs of concept and document all observations before proposing a new solution.
Conduct design review sessions with other teams and identify areas for improvement.
Process Improvement and Automation
Suggest automation to improve existing processes.
Assist junior Data Engineers by providing expert advice or troubleshooting steps whenever required.
Create technology-focused training plans whenever required.
Deliver technology-focused training sessions for team members whenever required.
Conduct Expert Knowledge Sharing sessions with Client Stakeholders whenever required.
Assist in designing case study documents whenever required.
Work and Technical Experience:
Must-Have:
In-depth knowledge of the following AWS services: S3, EC2, EMR, Athena, AWS Glue, Lambda
Experience with at least one MPP database: AWS Redshift, Snowflake, SingleStore
Proficiency in Big Data technologies: Apache Spark, Databricks
Strong programming skills in Python
Experience building data pipelines in AWS and Databricks
Experience with Big Data table formats, such as Delta Lake (open source)
Very strong SQL skills
Experience with orchestration tools like Apache Airflow
Expertise in developing ETL workflows with complex transformations such as SCD, deduplication, aggregations, etc.
Good to Have:
Cloud Databases - Snowflake, AWS Aurora
Big Data - Hadoop, Hive
Associate Level or Professional Level AWS Certification or Databricks Certification
Qualifications:
7+ years of overall IT experience
5+ years of relevant experience in AWS-related projects
Bachelor's degree in computer science, engineering, or a related field (a master's degree is a plus)
Qualities:
Strong hold on technical knowledge and experience.
Ability to deep-dive and research various technical fields.
Self-motivated and focused on delivering outcomes for a fast-growing team and firm.
Prior experience working in a large media company would be an added advantage.
Years of Experience
5 to 7 years
Location
India