Location: Indore, Bangalore, Gurgaon, Noida, Hyderabad and Pune
Experience: 1-5 years

Description: We are looking for a Data Engineer with a minimum of 2 years of experience in AWS and PySpark, who is passionate about technology, motivated to learn continuously, and views every client interaction as an opportunity to create an exceptional customer experience.

Qualifications:

Must have:
- BE/B.Tech/MCA/MS-IT/CS/B.Sc/BCA or any other degree in a related field
- Expertise and hands-on experience with AWS, PySpark, Airflow, Hadoop and SQL databases

Job description:
- 1-5 years of good hands-on exposure to Big Data technologies: PySpark (DataFrame and SparkSQL), Hadoop and Hive
- Good hands-on experience with Python and Bash scripts
- Able to write Glue jobs
- Hands-on experience with cloud-platform Big Data services (e.g. Glue, EMR, Redshift, S3, Kinesis)
- Good understanding of SQL and data warehouse tools such as Redshift
- Strong analytical, problem-solving, data analysis and research skills
- Demonstrable ability to think outside the box rather than depend on readily available tools
- Excellent communication, presentation and interpersonal skills are a must
- Able to implement business logic using Python and PySpark
- Write Python scripts to support the solution
- Participate in peer code reviews to ensure our applications comply with best practices
- Provide estimates for development tasks
- Able to perform integration testing of developed infrastructure

Good to have:
- Orchestration with Airflow and experience with any job scheduler
- Experience migrating workloads from on-premise to cloud and from cloud to cloud

Mandatory skills: Big Data, PySpark, Airflow, AWS, Glue

Please share your resume at [HIDDEN TEXT]