Posted on Jan 4, 2021 by Gazelle Global Consulting
Data Engineer - HDFS
An international business is currently looking for a Data Engineer to join their team in Amsterdam on a contract basis.
As a Data Engineer you will work on an enterprise-level project centred on their HDFS platform, performing optimisation and carrying out general data engineering duties.
The successful Data Engineer will need a good understanding of data engineering concepts such as data flows and pipelines, data modelling and mapping, along with strong knowledge of SQL.
An understanding of the HDFS platform and related tools such as Spark, Hive, Kerberos and HBase is crucial. The ideal Data Engineer will also have knowledge of Kafka, Scala, and ETL tooling (preferably the Talend Studio platform), as well as a good understanding of data warehousing concepts.
If this sounds of interest, please get in touch at your earliest convenience with an up-to-date CV using the contact information below.