Experience:
3-5 years of experience
Employment Type:
Full time
Posted:
1/8/2018
Job Category:
Software Development
Hadoop Administrator
(This job is no longer available)
Job Description

Looking for a Hadoop Administrator with experience in Hortonworks Data Platform, cluster connectivity, Python, Java, Spark, AWS cloud, Kafka, Git, Jenkins, and Ansible.

This is a systems administrator position.

As a Hadoop Administrator, you will be responsible for the design, implementation, security, and ongoing support of Arity's Hortonworks Hadoop distribution on AWS. You will partner with data scientists as well as product stakeholders to create an advanced analytics framework.

Key Responsibilities

  • Manage large-scale, multi-tenant Hadoop cluster environments residing on AWS.
  • Handle all Hadoop environment builds, including design, security, capacity planning, cluster setup, performance tuning and ongoing monitoring.
  • Perform high-level, day-to-day operational maintenance, support, and upgrades for the Hadoop Cluster.
  • Research and recommend innovative, and where possible, automated approaches for system administration tasks.
  • Create key performance metrics measuring the utilization, performance, and overall health of the cluster.
  • Deploy new/upgraded hardware and software releases and establish proper communication channels.
  • Work with appropriate stakeholders to ensure we have solid capacity planning and can manage our TCO.
  • Collaborate with product managers, lead engineers, and data scientists on all facets of the Hadoop ecosystem.
  • Ensure existing data/information assets are secure and adhere to a best-in-class security model.

Job Qualifications

At least 3 years of experience managing a multi-tenant production Hadoop environment;

  • A deep understanding of Hadoop internals, design principles, cluster connectivity, security, and the factors that affect distributed system performance;
  • Proven experience with identifying and resolving hardware and software related issues;
  • Experience with Hortonworks Data Platform;
  • Knowledge of best practices related to security, performance, and disaster recovery;
  • Expert experience with at least two of the following: SQL, Python, Java, Scala, Spark, or Bash.

Additional Experiences Preferred

  • Experience managing Cloud Services (IaaS, PaaS). AWS Certification is preferred;
  • Experience with real-time integration tools such as Apache Flink, Kafka, Storm, or Flume; experience leveraging Hortonworks DataFlow is preferred;
  • Experience with Git, Jenkins, and Ansible is a plus;
  • Experience with complex networking infrastructure, including firewalls, VLANs, and load balancers;
  • Experience as a DBA or Linux administrator.

Qualifications

Applicants must be eligible to work in the specified location