Big Data Engineer


Posted on Apr 14, 2021 by Swisslinx

Basel, Switzerland
Immediate Start
Annual Salary

As one of the top suppliers to our client, a prestigious bank in Basel, Swisslinx are looking for a highly motivated and technology-savvy Big Data Engineer with 5+ years' experience in the big data space to help implement our next-generation data and analytics systems.

This is a rolling contract role (with the possibility to extend for up to five years) in Basel, Switzerland, starting no later than July 2021 and initially running until 31 March 2022.

You will join a highly professional and motivated team of data and analytics professionals that is growing due to increased demand. You will implement, maintain and support shared data platforms and bespoke analytical systems, exploiting cutting-edge technologies and modern software development practices.

The role tackles a range of complex software and data challenges, including data management, advanced analytics and business intelligence, and is pivotal to implementing, maintaining and supporting data pipelines on the Bank's shared data platforms.

Responsibilities include:

Provide implementation support on a variety of data management and analytics projects using the Bank's approved databases (big data as well as relational databases) and analytical technologies
Develop within, and follow best practices for, the big data environment
Develop the code required for optimal extraction, transformation and loading of data from a wide variety of data sources using SQL and big data technologies
Translate, load and present disparate datasets in multiple formats/sources, including JSON
Translate functional and technical requirements into detailed designs
Participate in and contribute to the overall data architecture and the design of Big Data & Advanced Analytics solutions
Design, develop, construct, install, test and maintain highly scalable, robust and fault-tolerant data management and processing systems by integrating a variety of programming languages and tools
Develop end-to-end data pipelines: defining database structures, applying business rules and functional capabilities, security, data retention requirements, etc.
Work with stakeholders including the business area sponsors, product owners, data architect, data engineers, project managers and business analysts to assist them with their data-related technical issues and support their data needs.
Work with the big data solution architect to strive for greater standardisation across data pipelines
Ensure effective design and development of system architectures through frequent product deliveries; employing effective governance methods for transparency and communication.
Remain up to date with industry standards and technological advancements that will improve the quality of your outputs
Propose and implement ways to improve the data quality, reliability and efficiency of the whole system
Interact with the business to identify, capture and analyse business requirements
Develop functional specifications in a team environment and derive use cases where appropriate
Assist with and support proofs of concept as big data technology evolves
Ensure the solutions developed adhere to security and data entitlement requirements

This is a great chance to work in a highly collaborative environment, not just among the team but with expert economists, technologists, data scientists and statisticians - and counterparts in other international organisations and central banks.

We're looking for a candidate who is passionate about data and analytics, with the ability to think outside the box and introduce new ideas into the team.

In order to be considered for this role, you will have a minimum of 5 years' experience as a Big Data Engineer, a broad set of data engineering skills, experience with agile software development methodologies, and as many of the following skills and experience as possible:

5+ years' work experience in the big data technology space
Experience in building data ingestion pipelines (ETL or ELT workloads for data warehouse or data lake architectures)
Hands-on development using open-source big data tools such as Hadoop, Hive, Spark, HBase, Kafka, Impala, ELK etc., preferably with Cloudera
Strong experience in data modelling, design patterns and building highly scalable big data applications
Experience in designing and developing large-scale distributed applications using HBase/Hadoop/Spark/Scala
Experience with relational SQL and NoSQL databases: SQL Server, Postgres, Cassandra, etc.
Experience with data pipeline and workflow management tools: Airflow, Rundeck, NiFi, etc.
Experience with stream-processing systems: Kafka, Spark Streaming, etc.
Fluency in English (additional languages a plus)

Please note that due to COVID, interviews will be conducted remotely; however, the role is based 100% in Switzerland. The team is currently working from home during the pandemic, and 2-3 days of home office per week will be offered post-COVID.

Are you interested in working in an international environment at one of the leading financial companies in Switzerland? Then apply now! We look forward to receiving your full application.

By applying for this position, I consent to the Swisslinx Group of companies:
- storing my personal information (including name, contact details, identification and CV information, etc.) on their internal or external servers for the purpose of informing me of potential employment opportunities
- using my personal information, or supplying it to third parties upon my express consent, for the purpose of informing me of potential job opportunities
- transferring, where applicable, my personal information to a country outside the EEA/EFTA

I also hereby agree to the Swisslinx privacy policy.

Reference: 1163438496

Set up alerts to get notified of new vacancies.