Posted on Sep 8, 2021 by Berkeley Square IT Ltd
€600 per day
Do you want to work for an international institute with projects all over Europe?
My client is looking for a fluent English-speaking Data Engineer to join their ongoing project in Brussels.
This is an opportunity to work for an exciting, dynamic company on a long-term major project in an international, multilingual environment, where you can make a real impact on the team. The role also offers a fantastic package.
- Design and implement data engineering solutions
- Design and implement data models, interface definitions, and technical and functional documentation
- Design, analyse, and implement Business Intelligence components such as Data Warehouses, Data Lakes, Data Marts, ETLs, Reporting Tools, Analytics, Data Mining Tools
- Design, create, test and maintain data pipelines to support analytics projects
- Excellent Python and/or R development skills (minimum 8 years)
- Good experience working on data-driven projects
- Design and implement data cleansing, data preparation and other data-processing solutions using the available tools, e.g. SAS, Oracle
- Write, debug, test and improve code
- Work in an agile/iterative way, e.g. participating in drafting the backlog, user stories and sprint definitions
- Follow up, coordinate and facilitate all data manipulation activities and initiatives
- Collect business requirements and interact with the business users
- Provide support and training on data manipulation and processing to the different stakeholders
- Draft and/or review technical requirements for a data analytics environment, covering both hardware and software, and create the functional architecture for that environment
- Estimate the effort required to execute the requested activities
- Draft progress and technical reports
- Follow-up on IT delivery, including the activities of external contractors
- Signal risks and propose measures to mitigate those risks
- Coordinate stakeholders, e.g. business users and the infrastructure team
- Minimum of 13 years of professional experience
- Minimum 3 years of experience with Python and R
- Minimum 8 years of cumulative experience with big data processing in Python and/or R (e.g. 3 years with Python and 5 years with R)
- A background in statistics and experience with the SAS statistical functions are considered an asset
- Minimum 3 years of experience with the Python/R libraries dedicated to data manipulation (e.g. NumPy, pandas, scikit-learn, Matplotlib) and with the Anaconda distribution of Python/R
- Expert knowledge of database programming and data modelling, and of Oracle database programming and administration, supported by a minimum of 5 years' experience in this area
- SAS programming language knowledge and experience in this area or any SAS certification are considered an asset
- Minimum 3 years of experience with distributed data stores (Redis, Hadoop, Elasticsearch, Kafka)
- Expert knowledge of data warehousing techniques, designs and implementation, supported by a minimum of 5 years' experience in this area; a data warehousing or database programming certification will be considered an asset
- Expert knowledge of identifying performance bottlenecks and performance tuning when working with big data, in both the database and non-database layers, supported by a minimum of 5 years' experience in this area
- Experience in optimising and tuning visualisation tools (e.g. SAS Visual Analytics, Tableau) and code for better performance will be considered an additional asset
- Minimum 5 years of experience in implementing projects in an agile way
- Minimum 3 years of experience in estimating the hardware requirements needed to meet the performance requirements of big data projects; in collecting requirements from the business users, preparing test data and executing test cases in interaction with the business users; and in working on projects involving a large number of stakeholders
- The ability to draft functional and technical documents and reports, and to work fluently in English, is mandatory
- Knowledge and experience with Stata, and experience with Atlassian tools (e.g. Jira, Confluence, Bitbucket), are considered an asset
- Experience with Machine and Deep Learning in Python, R and with non-relational databases are considered an asset
- Knowledge and experience with the SAS tools (e.g. SAS Enterprise Guide, SAS Visual Analytics, SAS Data Integration Studio) are considered an asset, supported by a minimum of 5 years' experience in at least 3 SAS tools, e.g. SAS Viya, SAS Enterprise Guide and SAS Visual Analytics
Interviews are being held very quickly, so if you are interested, apply, send your CV, or call (see below).