Big Data Engineer
Posted on Mar 24, 2021 by Request Technology - Craig Johnson
*We are unable to sponsor for this permanent, full-time role*
Prestigious Enterprise Company is currently seeking a Big Data Engineer with AWS experience. The candidate will design, develop, implement, test, document, and operate large-scale, high-volume, high-performance data structures for analytics and deep learning; implement both Real Time and batch data ingestion routines using best practices in data modelling and ETL/ELT processes, leveraging AWS technologies and Big Data tools; design a logical abstraction layer against large, multi-dimensional datasets and multiple sources; gather business and functional requirements and translate them into robust, scalable, operable solutions that work well within the overall data architecture; produce comprehensive, usable dataset documentation and metadata; and provide input and recommendations on technical issues to the project manager.
- 5+ years of experience with detailed knowledge of data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools.
- 5+ years of work experience with very large data warehousing environments
- 3+ years of experience with data modelling concepts
- 3+ years of Python development experience
- 2+ years' experience in Big Data stack environments (EMR, Hadoop, Glue, Hive)
- Experience with Kafka, Flume and an AWS tool stack such as Redshift and Kinesis is preferred.
- Experience in writing Spark ETL jobs
- Experience using software version control and build tools (Git, Apache Subversion, Jenkins)
- Demonstrated strength in architecting data warehouse solutions and integrating technical components
- Good analytical skills with excellent knowledge of SQL.
- Excellent communication skills, both written and verbal
- Experience working with solutions like GoldenGate, Syncsort, Attunity
- Java development experience is preferred
- Experience in gathering requirements and formulating business metrics for reporting.
- Experience building on AWS using S3, EC2, Redshift, DynamoDB, Lambda, QuickSight, etc.
- AWS certifications or other related professional technical certifications
- Experience with cloud or on-premise Middleware and other enterprise integration technologies
- Bachelor's Degree
- Relevant experience or degree in Computer Science, Management Information Systems, Business or a related field
- Typically a minimum of 4+ years of relevant experience
- Four-year college degree with 4 or more years, or a high school diploma with 6 or more years, of professional experience in full life cycle design and development, including IT architecture, banking industry experience, and understanding client requirements