Big Data Architect
Role: Big Data Architect
Employment Type: Permanent / Subcontract
Location: Sydney
One of our large clients is hiring a Big Data Architect to work on a large project in Sydney.
Key skills and experience:
- Hands-on experience in Cloudera, including prior experience in Cloudera upgrades and cluster management using Cloudera Manager.
- Experience in HDFS, MapReduce, Hive, Pig, Sqoop, Oozie, NoSQL/HBase, YARN, Spark, Scala, and Python.
- Good knowledge of Linux.
- Experience in Kafka and Flume configuration and deployment.
- Hands-on experience in backup and restore, cluster set-up, performance tuning, alerting and monitoring of failures, troubleshooting, production deployment of code from non-prod to prod, capacity planning, scaling, and administration (node addition, decommissioning/recommissioning, load balancing).
- Experience in HDFS support and maintenance.
- Experience in configuring Cloudera services.
- Experience in managing user accounts and configuration of external authentication.
- Experience in installing software patches and upgrades.
How to apply:
Please apply using the link below or call 03 8506 6546. Only short-listed candidates will be contacted.
Adaps is an equal opportunity employer that actively embraces diversity in its workforce through accurate community representation of gender, culture, thought and work arrangements.