Data Engineer - Remote
Posted on Feb 4, 2020 by E-Frontiers
We are looking for a senior data engineer. You will be part of an international, talented and motivated team that uses its combined knowledge of machine learning, natural language processing and big data technologies to build and continuously evolve the company's data pipeline. You are the perfect candidate if you have previous experience in, and an interest in, large-scale data processing, preferably on reactive streaming architectures.
Our engineers go the extra mile to make sure our solution is not just good, but great and beneficial for millions of travelers. You will systematically look for weaknesses and areas for improvement in our approach, and use them as motivation to create even better solutions.
What challenges await you?
- As a member of the Data team, you will work in a cross-discipline delivery team focused on serving our many core data products
- Gather and process raw data at scale using frameworks such as Hadoop, Spark, and Kafka
- Maintain and write new data processing pipelines handling hundreds of GB of data
- Optimize and improve existing features or data processes for performance and stability
- Take our data pipeline to the next level! We'd love to hear your ideas!
What do we expect from you?
- Experience building data pipelines, ideally streaming
- Very strong programming and architectural experience, ideally in Python, but we are open to other languages
- You find creative solutions to tough problems. You are not only a great data engineer; you are also an architect who is not afraid to pave the way for bigger and better things
- Experience with SQL and relational or non-relational databases
- You consider automated tests essential to writing maintainable code
- Team player who likes to help and to keep the focus on getting things done
- Proactivity and a continuous improvement mindset
Nice to have
- Expert-level knowledge of Python/Scala/Java. Experience with frameworks such as Luigi/Airflow is a plus.
- Experience with big data technologies (Hadoop, Spark, Flink, Hive, Impala, HBase, Pig, Redshift, Kafka)
- Experience building REST APIs
- Experience with AWS or other IaaS/PaaS.
- Experience with agile development practices such as CI, CD, TDD, Pair-Programming, Refactoring, etc.
- Experience with Docker and cluster management tools (YARN, Mesos, etc.) is a plus.
- Experience with Event-Driven Architectures and Domain-Driven Design is a plus.
What do we offer?
- A highly international & entrepreneurial environment (flat hierarchies, short decision paths)
- Spend up to 5% of your working time on your professional & personal development
- Take part in shaping our new office in Madrid city center!
- Flexible working hours/Options to work remotely
Please feel free to reach out to me for more info! ( (see below) )