Data Engineer (Azure Databricks/Synapse)
Posted on Jun 5, 2021 by SuisseCo GmbH
SuisseCo specializes in the international recruitment and placement of highly qualified IT specialists in Switzerland. We support our clients in the implementation of their IT projects and guarantee quick and flexible solutions of the highest quality.
For one of our clients in the financial industry, we are currently looking for a Senior Data Engineer (Lakehouse with Databricks/Synapse).
The responsibility of the Data Engineer is to build a Lakehouse setup on Azure for two strategic projects in Finance and Risk Management. Hands-on experience in building a Lakehouse using Data Lake, Databricks and Synapse on Azure is required.
- 3+ years of experience in data engineering application development using Spark, preferably in an enterprise environment
- Review existing solutions and their stages in the different data zones (bronze, silver, gold)
- Suggest the architecture and implement the Lakehouse according to the requirements of the two projects
- Benchmark which workloads are best run on Databricks versus Synapse
- Evaluate the optimal VM infrastructure for each Databricks workload
- Implement production data pipelines based on the logic of existing Databricks notebooks
- Implement the optimal setup and tuning of Delta files and their handling using Python and SQL
- Define and implement the interplay of ADF, Data Lake, Databricks, and Synapse through to Power BI
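The bronze/silver/gold zones referenced above follow the common "medallion" refinement pattern: raw ingested data (bronze) is cleansed and validated (silver), then aggregated into reporting-ready views (gold). The sketch below illustrates that flow conceptually; plain Python stands in for Spark DataFrames and Delta tables so the example stays self-contained, and the column names and validation rule are illustrative assumptions, not project requirements.

```python
# Conceptual sketch of the medallion (bronze/silver/gold) refinement.
# bronze = raw ingested records, silver = cleansed/validated, gold = aggregated.
from collections import defaultdict

def to_silver(bronze_rows):
    """Cleanse raw rows: drop records missing the key, normalize types."""
    silver = []
    for row in bronze_rows:
        if row.get("account_id") is None:  # hypothetical validation rule
            continue
        silver.append({"account_id": row["account_id"],
                       "amount": float(row["amount"])})
    return silver

def to_gold(silver_rows):
    """Aggregate cleansed rows into a reporting-ready summary."""
    totals = defaultdict(float)
    for row in silver_rows:
        totals[row["account_id"]] += row["amount"]
    return dict(totals)

bronze = [
    {"account_id": "A1", "amount": "100.0"},
    {"account_id": None, "amount": "5.0"},   # invalid: dropped in silver
    {"account_id": "A1", "amount": "25.5"},
    {"account_id": "B2", "amount": "10.0"},
]
gold = to_gold(to_silver(bronze))
# gold == {"A1": 125.5, "B2": 10.0}
```

In the actual role, each stage would be a Spark job reading from and writing to Delta tables in the respective Data Lake zone, orchestrated by ADF.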
Essential Skills and Qualifications
- Ability to work as part of a team, as well as independently with minimal direction.
- Excellent written, presentation, and verbal communication skills.
- Ability to collaborate with data architects, modelers, and IT team members on project goals.
- Strong knowledge of Lakehouse components such as ADF, Data Lake, Databricks, and Synapse
- Good knowledge of Power BI, SQL, Python
Please note that for this role, relocation to Switzerland is required. We are only able to hire EU-27 citizens or candidates with an existing Swiss working permit/citizenship.