Big Data Engineer - NIFI
Posted on May 12, 2022 by Flint Consulting Limited
My client, a well-known global telecoms provider, is currently looking for an experienced Big Data Engineer to join them on a long-term contract basis.
Please note this role is INSIDE IR35.
As the Big Data Developer for Networks data, you will be responsible for creating ETL pipelines in Hadoop, using a wide range of technologies to meet complex Networks data requirements. You will also be responsible for low-level design and technology choices, and will become a valued team member by setting up standards, deliverables and deployment components. You will carry out hands-on development with NiFi and Hadoop technologies in both streaming and batch modes.
Key Responsibilities include:
- Develop and deploy components in Hadoop. Create new ETL pipelines using NiFi, Spark, Kafka, Hive, HDFS, HBase, custom scripts and other current technologies.
- Troubleshoot and performance-tune existing pipelines, delivering improvements.
- Own the documentation and communicate to a wide range of audiences.
- Seek out and develop relationships with key stakeholders responsible for NetPulse Strategy
- Understand the capabilities of a wide range of technologies typically found within the Hadoop landscape, including open-source technologies, and apply this knowledge to your solutions
- Understand the capabilities of Hadoop cloud platforms and provide solutions that make cloud migration smooth
- Create quality solutions, often large scale and transactional in nature, for huge volumes of data.
- Orchestrate and run design workshops, and translate business strategy and requirements into solution architecture
- Understand common problems, design patterns and considerations for a distributed Hadoop architecture and apply this knowledge to your solutions
- Create and maintain key solution models and assets
- Resolve Big Data processing problems at enterprise level
- Understand the level at which solution architecture applies, and help transition your solutions to application and cloud architects and technical leads
- Have excellent communication, presentation and documentation skills
- Analyse and understand existing designs with minimal documentation.
Key Skills and Experience:
- You are a highly experienced Big Data Developer/Engineer with a deep understanding of the Hadoop Distributed File System and ecosystem (HDFS, MapReduce, Hive, Sqoop, Oozie, ZooKeeper, HBase, Flume, Apache Kafka, Apache NiFi, Spark, YARN, Knox & Tableau)
- You have hands on core development experience in NiFi, developing ETL pipelines end to end. (3+ years' experience desirable).
- You have intermediate-level Spark/Spark Streaming/Scala development experience (2+ years' experience desirable).
- Development of Spark jobs using Python, Scala, Hive, HBase and HDFS
- Develop Shell Scripts for related projects.
- Knowledge of Airflow is desirable.
- Requirement Analysis and Solution Design for Networks Data Analysis
- Design Hadoop Frameworks for enriching multiple Network Data feeds
- Strong analytical, problem-solving and synthesizing skills
- Large-scale enterprise, object-oriented design and hands-on experience in building distributed systems
- Design and document HLDs for major feeds, setting up standards and templates for the team to use
- Set up development and design standards for Hadoop development.
- Support new team members in getting up to speed with development.
- Performance-tune Hadoop jobs by defining configurations and table designs.
- Design, develop and performance-tune Hive SQL queries
- Support AWS Cloud Migration Design
- Develop streaming applications based on requirements, choosing the appropriate streaming tool.
- Perform design and code reviews; improve, optimize and performance-tune the code