
Big Data Engineer - Python & (Kafka - NiFi - HDFS)

Experis AG

Posted on Sep 8, 2021 by Experis AG

Zürich, Switzerland
IT
Immediate Start
Annual Salary
Contract/Project





Experis is the global leader in professional resourcing and project-based workforce solutions. Our suite of services ranges from interim and permanent recruitment to managed services and consulting, enabling businesses to achieve their goals. We accelerate organisational growth by attracting, assessing, and placing specialised professional talent.


Overview of business area or project:


This role will be part of the Cyber Security Analytics Data Pump project team.


The mission of the team includes:



  • Designing and developing robust, high-volume, streaming security data pipeline services

  • Documenting, supporting and maintaining the services


Key Responsibilities:



  • Design, develop and test big data solutions

  • Create code and configuration to implement product use cases

  • Analyze and integrate data from disparate sources

  • Analyze and optimize application performance

  • Evaluate and identify candidate technologies to meet business needs

  • Document test results, designs, and operational manuals for the solutions

  • Perform code upgrades

  • Identify and document best practices within your areas of subject-matter expertise

  • Provide continuous process improvement suggestions

  • Provide support for production operations

  • Occasional on-call or after-hours work for incident support


Essential Skills and Qualifications:


At least 4 years of experience in Big Data engineering and administration, including:



  1. Planning, implementing, tuning and scaling big data pipelines from the hardware up

  2. Strong experience with some or all of the following: Kafka, NiFi, HDFS, Spark Streaming, Flink

  3. Experience with one or more of the following programming/scripting languages: Python (preferred), Java, Scala, Bash

  4. Ability to operate comfortably in Linux environments

  5. Experience with web APIs, automation, and data-related technologies, preferably REST, JSON, Syslog, TCP, XML, and Avro

  6. Ability to work with network, data center, and infrastructure teams to optimize the application environment

  7. Bonus for experience with or knowledge of any of the following: Splunk, Salt, MS Azure, Amazon AWS/S3

  8. Fluency in English is mandatory


Desired Skills and Qualifications:



  • Proficiency in writing and using REST and SOAP APIs

  • Experience with containers and container management (Docker, Kubernetes, Linux Containers)

  • Experience working in a DevOps/SRE team, applying Agile methodology

  • Experience with assessing incident criticality and with escalation processes

  • Experience with writing technical documentation for internal audiences


Interested in this opportunity? Please send us your CV today via the link in the advert. If you have any questions, please contact Danny Besse.






Reference: 1313840537
