Hadoop Consultant/Big Data Architect || Bangalore || Notice: Immediate
Hiring for MNC, 6 - 11 Years, Bengaluru

Good Day!

This is Hameem from DynPro India Pvt Ltd. We have a requirement for a Hadoop Consultant/Big Data Architect for our deemed client at the Bangalore location. You will initially be on the payroll of DynPro India Pvt Ltd and deployed with our client.

Project Duration: Initially 24 months minimum (right to hire by client based on performance)
Interview: Nice to have candidates from Bangalore (F2F mandatory)
Work Location: Bangalore, EGL, Domlur
Experience: 6 to 13 Years
Positions: 2
Salary: As per market standards
Notice: Immediate to 15 days

JD:

Responsibilities
1. Dedicated Hadoop Senior Consultant to assist with Hortonworks Data Platform (HDP) and Hortonworks Data Flow (HDF) technical guidance and architecture
2. Contribute to HDP/HDF cluster architecture design review, validation, and performance optimization as required
3. Contribute to new product validation and implementation:
   - HDP security guidance, best-practice recommendations, and implementation
   - Data ingestion, data access, and data storage
   - Application deployment and disaster recovery: associated architecture, data replication, RTO/RPO
4. Platform management:
   - HDP/HDF platform management
   - Administration and monitoring recommendations and implementation
   - Assistance on upgrades and versioning; best practices (rolling upgrade, cold/hot upgrade)
   - Data governance and data quality using HDP components and integration
   - Data ingestion into the existing Hortonworks Data Platform using HDF and other HDP components (Kafka, flow management)
   - Hadoop architecture expertise; will work with the customer's functional teams on Hadoop design best practices as required
   - Knowledge Transfer (KT)
5. Ensuring availability and reliability of Data & Analytics systems
6. Responsible for implementation and ongoing administration of HDP infrastructure

Experience
1. Strong knowledge of Big Data architecture, Big Data clusters, and the Big Data administrator's role
2. Proficiency with Big Data processing technologies (Hortonworks Hadoop) and tools such as Sqoop, Hive, Flume, etc.
3. Configuration and performance tuning of a Big Data cluster
4. Manage, maintain, monitor, and troubleshoot a Big Data cluster
5. Data ingestion, data access, and data storage
6. Application deployment and disaster recovery: associated architecture, data replication, RTO/RPO
7. Experience in platform upgrades
8. Cluster connectivity, security, backup, and recovery
9. Experience setting up optimum cluster configurations for MapReduce, Spark, Hive, HBase, etc.
10. Good scripting knowledge in UNIX Shell/Perl
11. Excellent communication skills
12. Ability to multi-task, work in a fast-paced environment, and manage multiple priorities
13. Ability to learn quickly and retain knowledge
14. Ability to work independently with strong attention to detail
15. Ability to work well in a global team environment

Notice: Immediate to 15 days maximum, not more than that.

If you would like to apply for this opening, please provide the details requested below along with your updated CV in Word format for further processing:

Current CTC:
Expected CTC:
Employee Type (Permanent/Contract):
Notice Period with Current Company:
Current Location:
Preferred Location:
Willing to take interview on weekdays (Y/N):

Thanks & Regards,
Hameem
Sr. Talent Acquisition, DynPro India Pvt Ltd, Bangalore
Phone: 080 46725021
firstname.lastname@example.org

Thank you for your valuable time.
Education: Any Graduate
Functional Area: IT/Telecom - Software
Industry: Software Services