DevOps for Cloudera Hadoop
Posted on Apr 17, 2021 by Hadron Finsys GmbH
- Experience with one or more major Hadoop distributions and various ecosystem components (e.g. HDFS, Sqoop, Impala, Spark, Flume, Kafka, NiFi, etc.) is a must, while experience with data concepts (ETL, near-real-time/real-time streaming, data structures, metadata, and workflow management) would be a big strength.
- In-depth knowledge of Hadoop ecosystem details, viz. Hadoop, Spark, nodes, clusters, YARN, schedulers, encryption, etc.
- Good programming/scripting skills (Python, Java, C/C++, Scala, Bash, Korn shell) and an understanding of DevOps tools (Chef, Docker, Puppet, Bamboo, Jenkins)
- Prior hands-on Big Data platform development experience in large projects, with the ability to translate technical designs into working solutions and to build and maintain positive relationships with architects and Big Data project teams to work out platform features