Responsibilities:
. Get hands-on where needed and work alongside the technical team
. Support the Hadoop platform; knowledge of Oozie, Apache NiFi, Java, and Unix is a must
. Well-versed in networking, storage, and API creation and debugging
. Support applications on the big data platform, including debugging YARN
. Write and debug Python code for the designed frameworks
. Secure Hadoop clusters with proper access controls using Ranger and Kerberos
. Coordinate with product vendors to set up and configure infrastructure, network, and nodes
. Write Python automation scripts for alerts and monitoring dashboards
. Maintain and troubleshoot the environment with Ambari
. Integrate Hadoop with reporting tools such as Tableau, MicroStrategy, SAS, and HDF
. Design and propose disaster recovery approaches for big data
. Tune and scale up clusters, proposing improvements
. Migrate HDP clusters to CDP or to the Azure cloud
Skillset:
. L3-level Hadoop (Hortonworks) administration or support knowledge
. Cloudera Data Platform (CDP) administration
. Hive/HBase SQL tuning for MapReduce jobs
. Shell/Python scripting experience
. CDP Administration certification
. Red Hat Linux Administration certification is an added advantage
. MySQL/PostgreSQL knowledge
. VMware virtualization knowledge preferred
. Knowledge of Pure and Isilon storage and S3 file systems is an added advantage
. Must be able to engage with technical staff as well as semi-technical senior stakeholders and customers' application teams
Years of Experience:
7 to 12 years of overall experience, including 5+ years in big data engineering and production support
Role: Other Production/Engineering/R&D
Industry: Other
Function: Manufacturing/Engineering/R&D
Job Type: Permanent Job
Date Posted: 19/11/2024
Job ID: 100816289