Full Stack Developer, Engineer - Big Data, DevOps, Java, Spark (7500)
Posted on Apr 17, 2021 by iET SA
DevOps/Full Stack Developer - Big Data (Java/Spark) | Lausanne City | 12 Months | Daily rate upon request
For a long-term project at our client's site, an international bank in Lausanne, we are looking for a DevOps/Full Stack Developer - Big Data (Java/Spark).
The successful candidate will be working on a critical project for enabling the bank to comply with critical FINMA and local regulatory requirements. You will be working on a regulatory reporting product using Big Data (Java/Spark).
Please be aware that this is a role within a sensitive environment. You will have to undergo enhanced background screening, which takes some additional time; we therefore prefer local candidates.
As of January 1, 2021, UK citizens need work visa sponsorship in order to work in Switzerland. iET, as an employment agency, does not sponsor work visas. We can only work with UK citizens who are already in Switzerland and hold a valid work permit.
Your Qualifications:
- Several years' experience as a DevOps/Full Stack Developer, including at least 2 years with the Git version control system, in order to manage various software build challenges
- Practical experience with a Linux distribution, preferably the Red Hat family, required to automate deployment and maintain applications hosted on that platform
- Over 2 years' experience with Python and Bash, plus a readiness to work with Java, as a minimum requirement for working with professional code
- Docker experience is nice to have; the project is developing CI/CD on the OpenShift platform
- Experience with Git, Docker, Jenkins, Python scripting, and DevOps methodology would be an asset
- Fluent in English
Your Tasks:
- Initial setup and maintenance of development and test environments
- Development tasks, implementing user stories
- 3rd-level support activities, such as investigating and resolving production issues that require IT involvement
- Deployments into UAT and production environments
- Support UAT testing when required
- Investigation of calculation issues on the Hadoop cluster
- Identify problems and provide solutions for automated deployments
- Work with the development team to ensure the application runs correctly
Off to new destinations! Apply now directly or contact our team.