This Job Vacancy has Expired!

Hadoop Developer-Streaming and ingestion (inside IR35)

Silicon Logic

Posted on Aug 26, 2021 by Silicon Logic

London, United Kingdom
IT
Immediate Start
Daily Salary
Contract/Project


  • Spark Streaming configuration and integration with Kafka.
  • Experience with API development and troubleshooting is a must.
  • Experience working with Cloudera Data Platform (CDP) is a plus.
  • Kafka cluster configuration with partitions and consumers.
  • At least one Spark Streaming and Kafka application deployment is required.
  • Experience in StreamSets configuration and pipeline creation/copying.
  • Experience integrating message queues with Kafka/Elasticsearch clusters within StreamSets.
  • Experience in Cassandra/Elasticsearch integration with Spark applications.
  • Spark job tuning and troubleshooting experience.
  • Experience creating Sqoop jobs to ingest data from various RDBMS systems and FTP files.
  • Airflow scheduler experience is a must; exposure to Apache NiFi is a plus.
  • Hadoop, Hive, Impala, and Kudu experience is a must, along with debugging skills.
  • Experience with Azure Data Lake Storage and deploying pipelines in Azure Data Factory is a must.
  • Python and Hadoop YARN experience is a must.
  • Support and maintain the application, and troubleshoot any issues in the module.



Reference: 1300037247
