DevOps Engineer SQL Batch/Shell Scripting ETL SSIS SSRS Genève - Suisse
Posted on Jan 10, 2021 by McCabe & Barton
We are looking for a DevOps Engineer with SQL, Batch/Shell Scripting, ETL, SSIS and SSRS experience.
If you have Python and/or machine learning experience, this job is made for you!
We are looking for candidates in the Genève area and/or cross-border workers (frontaliers).
We are looking for a DevOps Engineer whose typical day might include any of the following:
- Managing schedulers (Jenkins) to extract data from production databases
- Batch, Shell Scripting to move data within the data warehouse
- Monitoring log files to ensure data reporting jobs (currently c.40 per month) are running smoothly
- Designing new work intake pipeline with business and Data Science teams to deliver new data reports
- Jointly, with Data Scientists, designing automated communication strategies for internal and external stakeholders
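The log-monitoring duty above can be sketched as a small shell helper. The log directory layout and the "ERROR" marker are illustrative assumptions, not details from the posting:

```shell
# Hypothetical helper: scan a directory of reporting-job logs and flag
# any job whose log contains an error marker. Directory layout and the
# "ERROR" marker are assumptions for illustration only.
check_logs() {
    dir="$1"
    bad=0
    for log in "$dir"/*.log; do
        [ -e "$log" ] || continue          # skip when the glob matches nothing
        if grep -q "ERROR" "$log"; then
            echo "FAILED: $(basename "$log")"
            bad=1
        else
            echo "ok: $(basename "$log")"
        fi
    done
    return "$bad"                          # non-zero exit lets a scheduler alert
}
```

A scheduler such as Jenkins could run this after each batch window and raise an alert on a non-zero exit code.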
MUST be competent with:
- Understanding Data Scientists' requirements for organising and accessing data
- Batch and/or Shell Scripting
- SQL Server database tasks such as backup, restore, encryption
- Running ETL processes, currently based on SSIS and SSRS
- Following best practices in data movement such as logs and error handling
- Maintaining a consistent quality of work and rigorous attention to detail
- Working collaboratively across teams such as Development, IT and Data Science
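As a minimal sketch of the scripting, logging and error-handling practices listed above (the directories, file pattern and log format are all hypothetical):

```shell
#!/bin/sh
# Hypothetical data-movement step with timestamped logging and error
# handling. SRC_DIR, DEST_DIR, the *.csv pattern and the log path are
# illustrative assumptions, not details from the posting.
set -eu

SRC_DIR="${1:-./staging}"
DEST_DIR="${2:-./warehouse}"
LOG_FILE="${3:-./transfer.log}"

log() {
    # Timestamped entry so each job run can be audited later
    printf '%s %s\n' "$(date '+%Y-%m-%d %H:%M:%S')" "$1" >> "$LOG_FILE"
}

mkdir -p "$DEST_DIR"
for f in "$SRC_DIR"/*.csv; do
    [ -e "$f" ] || { log "WARN no files found in $SRC_DIR"; break; }
    if cp "$f" "$DEST_DIR/"; then
        log "OK   moved $(basename "$f")"
    else
        log "FAIL could not move $(basename "$f")"
        exit 1
    fi
done
log "DONE batch complete"
```

Failing fast with a non-zero exit and writing a log line per file keeps a scheduled job both alertable and auditable.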
Extras that would be awesome:
- Python skills
- Exposure, or willingness to learn, machine learning techniques
- Prior experience managing a repository of machine learning models
Summary
You will be joining a well-established, cutting-edge DevOps team within a multinational fintech. This position works primarily with the company's fast-growing Data Science team to scale up the existing ETL pipelines, automate end-to-end data science prototyping and communications, and catalogue the model store of machine learning and deep learning models. You would also be responsible for managing the day-to-day performance of an internal data warehouse.