Responsibilities : A day in the life of an Infoscion
As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, design, and deployment.
You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build proofs of concept (POCs).
You will create requirement specifications from the business needs, and define the to-be processes and detailed functional designs based on those requirements.
You will support configuring solution requirements on the products; diagnose the root cause of any issues, seek clarifications, and then identify and shortlist solution alternatives.
You will also contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers.
If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities :
Knowledge of design principles and fundamentals of architecture
Understanding of performance engineering
Knowledge of quality processes and estimation techniques
Basic understanding of project domain
Ability to translate functional/non-functional requirements into systems requirements
Ability to design and code complex programs
Ability to write test cases and scenarios based on the specifications
Good understanding of SDLC and agile methodologies
Awareness of latest technologies and trends
Logical thinking and problem-solving skills along with an ability to collaborate
Technical and Professional Requirements :
Minimum 3 years of experience in Hadoop administration, working on production support projects
Experience installing, configuring, maintaining, troubleshooting, and monitoring Hadoop clusters and components such as HDFS, HBase, Hive, Sentry, Hue, YARN, Sqoop, Spark, Oozie, ZooKeeper, Flume, and Solr
Experience installing, configuring, maintaining, troubleshooting, and monitoring analytical tools such as Datameer, Paxata, DataRobot, H2O, MRS, Python, RStudio, SAS, Dataiku, and BlueData, and integrating them with Hadoop
Strong job-level troubleshooting skills (YARN, Impala, and other components) (must)
Experience with and strong knowledge of Unix/Linux and shell scripting (must)
Experience with and knowledge of tools such as Talend, MySQL Galera, Pepperdata, Autowatch, NetBackup, Solix, UrbanCode Deploy (uDeploy), and RLM
Ability to troubleshoot development and production application problems across multiple environments and operating platforms
Preferred Skills : Technology->Big Data - Hadoop->Hadoop
Educational Requirements : MSc, BSc, BCom, MBA, MTech, BTech, Master of Engineering, Bachelor of Engineering
Service Line : Enterprise Package Application Services
Beware of fraud agents! Do not pay money to get a job.
MNCJobsIndia.com will not be responsible for any payment made to a third-party. All Terms of Use are applicable.
Job Detail
Job Id : JD3590998
Industry : Not mentioned
Total Positions : 1
Job Type : Full Time
Salary : Not mentioned
Employment Status : Permanent
Job Location : Bangalore, Karnataka, India
Education : Not mentioned
Experience : Year
Apply For This Job