This Job Vacancy has Expired!

Big Data Engineer

Posted on Aug 27, 2020 by Randstad (Schweiz) AG

Zürich, Switzerland
1 Oct 2020
Annual Salary

We are currently supporting a leading Swiss bank in Zurich in their search for a Big Data Engineer.

Overview of the business areas/project:

Within the Global Markets business area, Execution and Agency Products is responsible for all Listed and Primary Market security execution for clients, predominantly covering Wealth Management and Private clients in Switzerland.

The project aims to build a scalable platform for processing streaming and end-of-day data, enriching it further, extracting business metrics and presenting the data to the users using off-the-shelf dashboard solutions.
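As a rough illustration of the kind of processing the project describes (sourcing trade data, enriching it, and extracting business metrics), here is a minimal stdlib Python sketch. The event fields, reference data, and the VWAP metric are illustrative assumptions, not the bank's actual data model; in production this logic would run on a distributed engine such as Spark or Flink.

```python
from collections import defaultdict

# Hypothetical raw trade events; field names are illustrative assumptions.
trades = [
    {"symbol": "NESN", "price": 108.5, "qty": 200},
    {"symbol": "NESN", "price": 108.7, "qty": 100},
    {"symbol": "UBSG", "price": 11.2, "qty": 500},
]

def enrich(event, ref_data):
    """Enrich a raw trade with static reference data (here: trading venue)."""
    return {**event, "venue": ref_data.get(event["symbol"], "UNKNOWN")}

def vwap(events):
    """Volume-weighted average price per symbol -- a typical business metric."""
    notional = defaultdict(float)
    volume = defaultdict(float)
    for e in events:
        notional[e["symbol"]] += e["price"] * e["qty"]
        volume[e["symbol"]] += e["qty"]
    return {s: notional[s] / volume[s] for s in notional}

ref = {"NESN": "XSWX", "UBSG": "XSWX"}       # hypothetical reference data
enriched = [enrich(t, ref) for t in trades]  # enrichment stage
metrics = vwap(enriched)                     # metric-extraction stage
```

The same enrich-then-aggregate shape maps directly onto a streaming pipeline, where the aggregation would be windowed rather than computed over a finished batch.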

Key Responsibilities

  • Develop software components that source intraday and end-of-day data, process it further and extract analytical insights from it
  • Develop monitoring capabilities for the data processing pipelines in Production
  • Actively participate in the design and technology review of the software components developed in the team
  • Evolve the overall architecture of the solution using the latest technologies available in the bank
  • Work to streamline the development process and improve software performance
  • Contribute to integration testing (automated and manual) efforts as required
  • Collaborate with platform management and other team members on the requirements, preparing the releases and delivering the applications to production
  • Assist in resolving incidents involving the Production system (3rd-level support)

The successful candidate will join the Execution and Agency Products development organization, a development team of 60+ people. This role is part of a global team with members in multiple countries.

The main challenges will be:

  • to provide robust solutions for the volumes of data the software is expected to process
  • to develop software components that meet business requirements
  • to contribute to the re-architecture of the software landscape in the division
  • to develop appropriate software solutions despite challenging timeframes
  • to comply with policies and standards typical of a large organization

Essential Skills and Qualifications:

  • Deep expertise in developing data-intensive software solutions using Python and Scala
  • Extensive experience with Big Data technologies, such as Hadoop, YARN, Spark, Impala, Kudu, NiFi, Oozie
  • Practical experience with major Hadoop distributions and platforms (Cloudera, Hortonworks, Databricks); good knowledge of the individual components; ability to troubleshoot issues in Production
  • Experience in building data streaming pipelines using Apache Spark or Apache Flink (ideally both)
  • Experience with data modelling and the SQL query language
  • Good understanding of security concepts
  • Experience in successfully integrating emerging open-source technologies and ability to shape the solution architecture
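To illustrate the data-modelling and SQL points above with something self-contained, here is a small sketch using Python's built-in sqlite3 module. The schema (instrument and execution tables) and the end-of-day notional query are invented for illustration only.

```python
import sqlite3

# In-memory database; the schema below is an illustrative assumption,
# not an actual production data model.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE instrument (isin TEXT PRIMARY KEY, name TEXT);
    CREATE TABLE execution (
        id    INTEGER PRIMARY KEY,
        isin  TEXT REFERENCES instrument(isin),
        price REAL NOT NULL,
        qty   INTEGER NOT NULL
    );
""")
conn.executemany("INSERT INTO instrument VALUES (?, ?)",
                 [("CH0038863350", "Nestle"), ("CH0244767585", "UBS")])
conn.executemany("INSERT INTO execution (isin, price, qty) VALUES (?, ?, ?)",
                 [("CH0038863350", 108.5, 200),
                  ("CH0038863350", 108.7, 100),
                  ("CH0244767585", 11.2, 500)])

# End-of-day business metric: traded notional per instrument.
rows = conn.execute("""
    SELECT i.name, SUM(e.price * e.qty) AS notional
    FROM execution e JOIN instrument i USING (isin)
    GROUP BY i.name
    ORDER BY notional DESC
""").fetchall()
```

The same join-and-aggregate query shape applies whether the engine is SQLite, Impala, or Spark SQL; only scale and dialect details change.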

Desired Skills and Qualifications:

  • Experience with the Elastic technology stack (Elasticsearch/Kibana)
  • Practical experience with Apache Druid/Apache Ignite
  • Experience with automated provisioning and configuration of open-source distributed systems
  • Exposure to Kubernetes

Ready to take on this new challenge in your career?

I look forward to receiving your application.

Reference: 936627116
