The Yahoo! Hadoop Product Engineering team is searching for engineers to help
build, update, maintain, and monitor our Hadoop open source systems (e.g., Hadoop
Core, HBase, Storm, and Oozie). These systems consist of tens of thousands of
computers and petabytes of data, spread across multiple datacenters. If you are
looking to work on large-scale systems and to learn what Big Data really means,
then Yahoo! is the place for you!
RESPONSIBILITIES
* Invent better ways to manage/automate the administration of new and existing
clusters, via scripting and tools development
* Consider, propose, and implement new methods for system deployment,
monitoring, management and automation
* Collaborate with cross-functional organizations (engineering, QA, site
operations, and security) on new product/feature design and/or diagnosis
of problems with production systems
* Diagnose and resolve problems in real time on live systems
* Monitor cluster health and performance, using critical thinking to identify
areas for improvement
* Coordinate efforts with your peers in Bangalore
* Document processes, grids, systems, and their associated configurations
* Plan hardware and facility capacity and provision new resources
* Implement and adhere to security policies and processes/controls
* Participate in a group pager rotation for emergencies
REQUIREMENTS
* Proven experience as a Unix systems engineer.
* Skilled in monitoring systems such as Nagios, Munin, and Icinga
* Experience in using one or more configuration management systems (Chef/Puppet/Quattor)
* Networking skills: DNS, the network stack, and subnetting
* Experienced operating large installations of systems.
* Strong scripting skills in one or more of Perl, Python, Bash, etc.
* Experience designing large deployments integrating a variety of hardware
and software technologies
* Demonstrated team experience managing complex production server
environments including direct administration of Unix systems
* Strong interpersonal and communication skills; capable of explaining
procedures clearly, both in writing and verbally
* Experience managing the Apache open source projects Hadoop Core, HBase, Storm, and Oozie
PREFERRED JOB QUALIFICATIONS
* BS or MS in Computer Science (or equivalent)
* Minimum of 6 years' experience as a Unix systems engineer / systems administrator,
with at least 2 years in an Internet-related business
* Experience with LDAP and Kerberos a plus
* 2+ years' experience operating large installations of systems
* Knowledge of and work experience with Java and cluster computing platforms such as
Hadoop, Condor, Torque, Maui/Moab, and Sun Grid Engine