Big Data Engineer
Posted on Sep 20, 2019 by Request Technology - Craig Johnson
Prestigious Fortune 500 company is currently seeking a Big Data Engineer. The candidate will be responsible for building new and advanced analytics applications on Big Data platforms such as Hadoop and Kafka. The candidate will be accountable for end-to-end engineering of data solutions, including designing and building systems for data storage and analytics that enable analysts to make better decisions and achieve business goals. This includes developing new systems for analysing data, coding and developing advanced analytics solutions to make and optimize business decisions and processes, and discovering new technical challenges that can be solved with existing and emerging Big Data hardware and software solutions.
- Uses and learns a wide variety of tools including R, Python, Scala and any other future analytic systems or languages
- Researches and evaluates new tools and technologies to solve business problems
- Responsible for designing and building new Big Data applications that turn data into actionable insights
- Drive projects through technical expertise.
- Partner with technology and the analytics community on big data efforts.
- Mentor other team members and participate in cross-training.
- Support highly available environments that are part of critical business processes.
- Utilize effective project planning techniques to break down highly complex projects into tasks, manage scope of projects, and ensure deadlines are kept.
- Keep up with current technology trends in the Big Data and Analytic space.
- Work on a geographically dispersed team, embracing Agile and DevOps strategies for themselves and others while driving adoption to enable greater technology and business value
- Experience in Java, Spark, Python and/or Scala
- Experience with source control solutions (e.g. Git, GitHub)
- Experience managing and manipulating large datasets.
- Experience with and solid understanding of Big Data ecosystems such as Hadoop, Spark, and Kafka
- Experience with various data types (e.g. relational, unstructured, hierarchical, linked graph data)
- Experience with database & ETL technologies
- Experience developing large-scale distributed applications in Cloudera Hadoop, Hive, and/or Impala
- Good verbal and written communication skills
- Open to new ideas and technologies with a strong desire to learn
- Ability to analyze and solve problems in a creative and innovative fashion
- Ability to research and document project design, approach and architecture
- Ability to engage subject matter experts and translate business goals into actionable solutions
- Ability to work effectively with business and technical teams
- Ability to meet deadlines, goals and objectives
- Works collaboratively with multiple areas and comfortable leading projects and mentoring others
- Familiarity and comfort with participating in a high-performing geographically dispersed team
- Understanding of predictive modelling techniques is a plus
- Strong experience in relevant technologies.
- Bachelor's/Master's degree, preferably in a Computing or Information Sciences field.
- Intermediate UNIX/Linux skills.
- Knowledge of one or more of the following: Java, Scala, Python.
- Fundamental knowledge of SQL.
- Understanding of JSON, AVRO, Parquet, XML etc.
$90k - $120k Annual