Job Description
- Dedicated Hadoop Administrator to provide Technical Guidance and Architecture on Hortonworks Data Platform (HDP) and Hortonworks Data Flow (HDF)
- Responsible for HDP/HDF Cluster Planning, Architecture Design, Review, Validation and Performance Optimization as required
- Responsible for HDP Security implementation with Kerberos, Knox, Ranger, KMS, etc.
- Support the team on Data Ingestion into the existing HDP cluster using HDF and other HDP components (Kafka, Storm, Spark, etc.)
- Support the team on Data Governance and Data Quality using HDP Components and Integration
- Application Deployment, Data Ingestion, Data Storage and Data Access Authorization (Sqoop, Pig, Hive, HBase, etc.)
- HDP/HDF Platform Administration and Monitoring (Ganglia, Nagios, Splunk Dashboards, etc.): recommendations and implementation
- Ensure Availability and Reliability of Data and Analytics Systems
- Assist with Cluster Upgrades and Versioning; best practices (Rolling and Express Upgrades)
- Disaster Recovery: Associated Architecture; Data Replication; RTO/RPO
- Work with Customers' functional teams on Hadoop configuration best practices as required
- Contribute to new Product validation, implementation and Knowledge Transfer (KT)
Profile: Mandatory Skills/Experience
- Strong Knowledge of Big Data Architecture and the Administrator's role
- Expertise in installing Hadoop Clusters and their related components
- Configuration and Performance Tuning of Hadoop Clusters
- Manage, Maintain, Monitor and Troubleshoot Hadoop Clusters
- Data Ingestion, Data Access and Data Storage
- Application Deployment and Disaster Recovery
- Experience with Hadoop Upgrades
- Cluster Connectivity, Security, Backup and Disaster Recovery
- Good scripting knowledge in UNIX Shell/Perl
- Excellent communication skills
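The UNIX shell scripting requirement above can be illustrated with a minimal sketch: a portable health-check function that alerts when a filesystem crosses a usage threshold, the kind of monitoring glue a Hadoop administrator routinely writes. The function name `check_fs_usage`, the mount point and the 80% threshold are illustrative assumptions, not part of the posting.

```shell
#!/bin/sh
# Alert when a filesystem (e.g. a hypothetical HDFS data mount) exceeds
# a usage threshold. Uses only POSIX utilities (df -P, awk).

check_fs_usage() {
    mount="${1:-/}"        # filesystem to check (illustrative default)
    threshold="${2:-80}"   # alert above this percentage used

    # df -P guarantees one record per filesystem on a single line;
    # awk strips the trailing '%' from the "Capacity"/"Use%" column.
    used=$(df -P "$mount" | awk 'NR==2 { sub(/%/, "", $5); print $5 }')

    if [ "$used" -ge "$threshold" ]; then
        echo "ALERT: $mount is ${used}% full (threshold ${threshold}%)"
    else
        echo "OK: $mount is ${used}% full"
    fi
}

# Example invocation: check the root filesystem against the default threshold.
check_fs_usage /
```

In practice a script like this would be scheduled via cron on each node and wired into the monitoring stack (Nagios checks or Splunk alerts) mentioned in the responsibilities above.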
Company Profile:
Salary: Not disclosed
Industry: Banking / Financial services / Broking
Functional Area: Web Technologies
Role Category:
Employment Type: Full time
Keyskills