Sonsoft Inc

Hadoop Administration (Austin, TX, Charlotte, NC, Chicago, IL, Cincinnati, OH, Cupertino, CA, Foster City, CA, Houston, TX, San Jose, CA, Sunnyvale, CA, Tampa, FL)

Hadoop Administrator at Sonsoft Inc in San Jose, CA. Requires 2+ years of experience with Hadoop, Cloudera, and Linux. Benefits include a competitive salary and growth opportunities.

Job Level: Entry Level
Department: DevOps

Job description

Posted on: June 21, 2017

We are seeking a Hadoop Administration professional with at least 2 years of experience in implementing and administering Hadoop infrastructure. The ideal candidate will have a strong understanding of Hadoop, MapReduce, HBase, Hive, Pig, and Mahout, as well as experience working with Cloudera Manager or Ambari, Ganglia, and Nagios.

Requirements

  • At least 2 years of experience in implementing and administering Hadoop infrastructure
  • At least 2 years of experience in project life-cycle activities on development and maintenance projects
  • Operational expertise in troubleshooting, including an understanding of system capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networking
  • Strong understanding of Hadoop, MapReduce, HBase, Hive, Pig, and Mahout
  • Hadoop administration skills: experience working with Cloudera Manager or Ambari, plus Ganglia and Nagios
  • Experience using Hadoop schedulers: FIFO, Fair Scheduler, and Capacity Scheduler
  • Experience in job schedule management with Oozie or enterprise schedulers such as Control-M or Tivoli
  • Good knowledge of Linux
  • Exposure to setting up AD/LDAP/Kerberos authentication models
  • Familiarity with open-source configuration management and deployment tools such as Puppet or Chef, as well as Linux scripting and Autosys
  • Experience in shell and Perl scripting and exposure to Python (a brief illustrative sketch follows this list)
  • Knowledge of troubleshooting core Java applications is a plus
  • Exposure to real-time execution engines such as Spark, Storm, and Kafka
  • Version control management tools: Subversion, ClearCase, CVS, or GitHub
  • Experience with service management ticketing tools such as ServiceNow, Service Manager, or Remedy
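Much of the list above comes down to day-to-day operational scripting around the cluster. As a purely illustrative sketch, assuming a node where the `hdfs` CLI is on the PATH, the Python snippet below shows the kind of lightweight health check such scripting typically covers: it parses `hdfs dfsadmin -report` and flags DataNodes whose DFS usage crosses a threshold. The threshold, function names, and exit codes are assumptions chosen for illustration, not requirements taken from this posting.

```python
#!/usr/bin/env python3
"""Illustrative sketch of an operational check a Hadoop admin might script.

Assumes the `hdfs` CLI is available and that `hdfs dfsadmin -report`
prints "DFS Used%" lines; the threshold and exit codes are arbitrary.
"""
import re
import subprocess
import sys

USAGE_THRESHOLD = 80.0  # flag nodes above this DFS-used percentage (assumed value)


def datanode_usage(report_text: str) -> list[float]:
    """Extract every 'DFS Used%: NN.NN%' figure from the dfsadmin report."""
    return [float(m) for m in re.findall(r"DFS Used%:\s*([\d.]+)%", report_text)]


def main() -> int:
    try:
        report = subprocess.run(
            ["hdfs", "dfsadmin", "-report"],
            capture_output=True, text=True, check=True,
        ).stdout
    except (OSError, subprocess.CalledProcessError) as exc:
        print(f"could not collect dfsadmin report: {exc}", file=sys.stderr)
        return 2

    hot = [u for u in datanode_usage(report) if u > USAGE_THRESHOLD]
    if hot:
        print(f"{len(hot)} node(s) above {USAGE_THRESHOLD}% DFS used: {hot}")
        return 1  # non-zero exit so cron or Nagios can alert on it
    print("all reported nodes below threshold")
    return 0


if __name__ == "__main__":
    sys.exit(main())
```

A check like this would typically run from cron or be wrapped as a Nagios plugin, which is where the scheduler, monitoring, and scripting requirements above intersect in practice.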

Requirements Summary

At least 2 years of Hadoop infrastructure experience, 2+ years of IT experience, and a Bachelor's degree