Posted on Apr 28, 2021 by Michael Bailey Associates - Zurich
Michael Bailey Associates are currently looking for a Data Engineer for a Pharmaceutical client based in Basel.
This is a 12-month project with the possibility of an extension.
The Senior Data Engineer is responsible for designing the data pipeline architecture in close partnership with the Product Owner to meet the needs of the product roadmap.
The role also involves expanding and optimizing the data and data pipeline architecture, as well as optimizing data flow and transformations.
- Responsible for defining how the pipeline is developed and for its overall quality
- Helps the team build new, relevant data engineering features to enhance the pipeline in line with the roadmap
- Plays a mentorship role to ensure the optimal application of best practices and coding conventions across the pipeline, and assists data engineers in their duties upon request
- Helps the team to achieve and surpass product and program goals
- Provides technical guidance and direction as well as contributing hands-on to pipeline development
- Acts as an evangelist and guiding practitioner of agile and DevOps practices
- Experience building large-scale software systems and implementing creative solutions to difficult computational problems (with an emphasis on performance and near-real-time data analytics)
- Senior-level data engineering (e.g. ETL) experience
- Databricks or Spark experience
- Proficient in Python, R or Scala
- Experience with AWS/Azure cloud technologies and stack
- Experience building big data applications and pipelines using Spark
- Strong knowledge of DevOps, coding conventions and best practices for User Acceptance Testing
- Self-motivated, curious, proactive and accountable
- Solid communication skills
Other specific requirements may include:
- Experience in medical imaging solutions
- Understanding of data42 data capabilities and tools
- Understanding of image storage best practices
- Technical ability to configure imaging systems to load and transfer files
- Knowledge of medical image formats and metadata
- Ability to work closely with data scientists to solve technical issues in building analytical blueprints for ML, DL, and genomic sequence analysis
- Understanding of ML/DL and image processing in a big data computing engine (e.g. Spark)
- Familiarity with performance tuning (Spark, data I/O); good knowledge of the internals of Spark and other components of the Hadoop ecosystem (HDFS, Hive)
Michael Bailey International is acting as an Employment Business in relation to this vacancy.