Implement and support Hadoop infrastructure on an Oracle Big Data Appliance
Implement and support our enterprise security standards on a Hadoop cluster
Propose and deploy new software environments required for Hadoop and expand existing environments
Set up new Hadoop users, including configuring and testing HDFS, Hive, Pig, and MapReduce access for each new user
Tune performance of Hadoop clusters and Hadoop MapReduce or Spark jobs
Monitor Hadoop cluster job performance
Plan Hadoop cluster capacity
Manage and review Hadoop log files
Manage and monitor cluster file system
Work with the infrastructure, network, database, application and business intelligence teams to guarantee high availability
Install operating system and Hadoop updates, patches and version upgrades when required
Work with developers to evaluate their Hadoop use cases, provide feedback and guidance
Escalate support issues with vendors
Perform development activities as required by Big Data project specifications, working with Python, Apache Spark, and/or Java/Scala
Manage system backups and disaster recovery (DR) plans
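The new-user setup described above (provisioning an HDFS home directory and smoke-testing access) can be sketched as a shell routine. This is a minimal sketch: the user name, paths, and example jar are illustrative placeholders, not cluster-specific values, and the cluster-touching commands are shown commented because they require a running cluster and HDFS superuser privileges.

```shell
#!/usr/bin/env bash
# Sketch of onboarding a new Hadoop user; names and paths are placeholders.
set -euo pipefail

new_user="${1:-jdoe}"

# Hadoop convention: each user gets a home directory under /user.
user_home="/user/${new_user}"
echo "Provisioning HDFS home: ${user_home}"

# These steps need a live cluster and superuser rights, so they stay commented:
# hdfs dfs -mkdir -p "${user_home}"
# hdfs dfs -chown "${new_user}:${new_user}" "${user_home}"
# hdfs dfs -chmod 750 "${user_home}"
#
# Smoke-test HDFS, Hive, and MapReduce access as the new user:
# sudo -u "${new_user}" hdfs dfs -ls "${user_home}"
# sudo -u "${new_user}" hive -e 'SHOW DATABASES;'
# sudo -u "${new_user}" hadoop jar \
#   "${HADOOP_HOME}/share/hadoop/mapreduce/hadoop-mapreduce-examples-"*.jar pi 2 10
```

In practice the same routine would be wrapped in a loop over a list of onboarding requests, with Kerberos principal and Ranger/Sentry policy creation added where the cluster is secured.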
KNOWLEDGE AND SKILLS REQUIRED:
Ability to proactively identify, troubleshoot and resolve live systems issues
Understanding of system capacity and bottlenecks, and of the basics of memory, CPU, OS, storage, and networks
Ability to install operating system and Hadoop updates, patches, and version upgrades when required
Hadoop development skills including HBase, Hive, Pig, Mahout, etc.
Ability to deploy a Hadoop infrastructure, add/remove cluster nodes and install software
Ability to schedule, configure, and track jobs; monitor critical parts of the cluster; and configure NameNode high availability
Ability to manage system backups and DR plans
Strong analytical and problem-solving skills, with the ability to clearly articulate solution alternatives
Exceptional interpersonal skills for communicating both internally and externally; a team player
Strong understanding of Hadoop design principles, cluster connectivity, security, and the factors that affect distributed system performance
Strong knowledge and experience in supporting Linux environments
Flexible, open to suggestions, and eager to learn and share knowledge
Self-motivated, organized, and attentive to detail
Ability to prioritize and work on multiple projects
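The NameNode high-availability item above is typically configured in `hdfs-site.xml`. A minimal sketch, assuming a hypothetical nameservice `mycluster`, two NameNodes `nn1`/`nn2`, and a three-node JournalNode quorum (all host names are placeholders):

```xml
<!-- hdfs-site.xml (fragment): NameNode HA with the Quorum Journal Manager -->
<property>
  <name>dfs.nameservices</name>
  <value>mycluster</value>
</property>
<property>
  <name>dfs.ha.namenodes.mycluster</name>
  <value>nn1,nn2</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.mycluster.nn1</name>
  <value>master1.example.com:8020</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.mycluster.nn2</name>
  <value>master2.example.com:8020</value>
</property>
<property>
  <name>dfs.namenode.shared.edits.dir</name>
  <value>qjournal://jn1.example.com:8485;jn2.example.com:8485;jn3.example.com:8485/mycluster</value>
</property>
<property>
  <name>dfs.client.failover.proxy.provider.mycluster</name>
  <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
</property>
<property>
  <name>dfs.ha.automatic-failover.enabled</name>
  <value>true</value>
</property>
```

Automatic failover additionally requires a ZooKeeper quorum and fencing configuration (`dfs.ha.fencing.methods`), omitted here for brevity.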
KNOWLEDGE AND SKILLS PREFERRED:
Administration activities on an Oracle Big Data Appliance
Troubleshooting Core Java Applications
Impala, Python, Apache Spark and/or Java / Scala
EDUCATION AND EXPERIENCE REQUIRED:
BS degree in Computer Science or a related field
Minimum of 3 years of experience with Hadoop and its related technology stack
Maintaining, troubleshooting and setting up large clusters
Supporting systems with 24x7 availability and monitoring
EDUCATION AND EXPERIENCE PREFERRED:
Implementing security on Hadoop (HDFS encryption, Kerberos and LDAP integration and Apache Ranger/Knox)
Performance tuning in Big Data environments
Cloudera distribution of Hadoop
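Kerberizing a cluster, as mentioned under security above, starts with switching Hadoop out of "simple" authentication in `core-site.xml`. A minimal sketch (the realm, KDC, and per-daemon principal/keytab settings are site-specific and omitted):

```xml
<!-- core-site.xml (fragment): switch authentication from 'simple' to Kerberos -->
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hadoop.security.authorization</name>
  <value>true</value>
</property>
```

Each daemon (NameNode, DataNode, ResourceManager, etc.) then needs its own principal and keytab configured in `hdfs-site.xml`/`yarn-site.xml`, and HDFS transparent encryption and Ranger/Knox policies layer on top of this base.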
Responsible for the implementation and ongoing administration of a Hadoop infrastructure on an Oracle Big Data Appliance. This includes working with the infrastructure, network, database, application, and business intelligence teams to guarantee high availability; implementing and supporting enterprise security standards on a Hadoop cluster; and tuning the performance of Hadoop clusters and Hadoop MapReduce or Spark jobs.
At AdventHealth, Extending the Healing Ministry of Christ is our mission. It calls us to be His hands and feet in helping people feel whole. Our story is one of hope — one that strives to heal and restore the body, mind and spirit. Our more than 80,000 skilled and compassionate caregivers in hospitals, physician practices, outpatient clinics, urgent care centers, skilled nursing facilities, home health agencies and hospice centers are committed to providing individualized, wholistic care.