Big Data Platform Engineer - Hadoop/DevOps/Cloud
Posted on Feb 7, 2020 by Orcan Intelligence
For our prestigious customer, we are looking for a Big Data Platform Engineer for a 3-month contract in Switzerland, with a high likelihood of extension.
- A challenging and interesting position as a Big Data Platform Engineer in the team
- In this role you will develop and deploy solutions to some of the company's most exciting analytics and big data problems
- As part of this highly motivated and diverse team, you will work with clients and data spanning global organizations to solve emerging critical challenges using new technologies
- You will grow within a team of Big Data engineers who are building the platform and innovating in core areas: real-time analytics and large-scale data processing across different environments (including the major cloud platforms)
- Experience with one or more major Hadoop distributions and various ecosystem components (e.g. HDFS, Sqoop, Impala, Spark, Flume, Kafka, NiFi) is a must; experience with data concepts (ETL, near-real-time/real-time streaming, data structures, metadata, and workflow management) would be a big strength
- Prior hands-on big data platform development experience in large projects, with the ability to translate technical designs into working solutions and to work closely with architects and Big Data project teams to define platform features
- Good programming/scripting skills (Python, Java, C/C++, Scala, Bash, Korn shell) and an understanding of DevOps tools (Chef, Docker, Puppet, Bamboo, Jenkins)
- Ability to work well as part of global and cross-cultural teams, with excellent verbal and written communication skills in English
- You have a bachelor's degree in Computer Science or an equivalent education in a related discipline
If you have the right skill set and are interested in applying, please send over your up-to-date CV for immediate consideration.