Data Analytics Engineer
Posted on Jun 2, 2021 by Michael Bailey Associates - Zurich
Michael Bailey Associates is currently looking for a Data Analytics Engineer for a client in the banking industry.
This is a contract until the end of March, with the possibility of a further extension.
This role tackles a range of complex software and data challenges, including data management, advanced analytics and business intelligence. It is pivotal to implementing, maintaining and supporting data pipelines on shared data platforms, exploiting cutting-edge data technologies and modern software development practices.
* Provide implementation support on a variety of data management and analytics projects using the Bank's approved databases (big data as well as relational databases) and analytical technologies
* Take responsibility for development in the big data environment and for following its best practices
* Develop required code for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and 'big data' technologies.
* Participate in and contribute to the overall data architecture and the design of Big Data & Advanced Analytics solutions
* Develop end-to-end data pipelines; define database structures and apply business rules, functional capabilities, security, data retention requirements, etc.
* Work with the big data solution architect to strive for greater standardisation across various data pipelines
* Remain up-to-date with industry standards and technological advancements that will improve the quality of your outputs
* Propose and implement ways to improve data quality, reliability & efficiency of the whole system.
* Interact with the business to identify, capture and analyse business requirements
* Develop functional specifications in a team environment, as well as derive use cases where appropriate
* Assist and support proof of concepts as Big Data technology evolves
* Minimum 5 years' work experience in the Big Data technology space.
* Experience in building data ingestion pipelines (simulating ETL or ELT workloads for a data warehouse or data lake architecture).
* Experience with Hadoop, Hive, Spark, HBase, Kafka, Impala, ELK, etc., preferably with Cloudera.
* Strong experience in data modelling, design patterns and building highly scalable Big Data applications.
* Experience in designing and developing large-scale distributed applications using HBase/Hadoop/Spark/Scala.
* Experience with relational SQL and NoSQL databases: SQL Server, Postgres, Cassandra, etc.
* Experience with object-oriented/object-functional scripting languages: Python, Java, C++, Scala, etc.
* Knowledge of Service-oriented architecture and experience in API creation
* Experience in Automated Testing, Test-driven Development, debugging, troubleshooting, and optimizing code.
* Advanced working SQL knowledge and experience.
We look forward to receiving your application :)
Michael Bailey International is acting as an Employment Business in relation to this vacancy.