Posted on Sep 2, 2021 by Request Technology
*We are unable to sponsor as this is a permanent full time role*
A prestigious company is searching for a Data Engineer with 1 to 3 years of Spark and Scala experience. This engineer needs to know Big Data platforms and will help develop platform components in the cloud. Any cloud provider experience is acceptable. There will be 8 other people on this team of data engineers.
- Drive the implementation of the defined innovation strategy through the design and architecture for analytics platform components/services utilizing Google Cloud Platform infrastructure and provide technical expertise to deliver advanced analytics
- Design, build & operate a secure, highly-scalable, reliable platform to consume, integrate and analyze complex data using a variety of best-in-class platforms and tools
- Collaborate with various global IT teams to develop Big Data reference architecture patterns for data ingestion, data processing, analytics and data science solutions
- Drive innovation through developing proofs of concept and prototypes to help illustrate approaches to technology and business problems
- Provide strong technical skills with ability to design, architect, and get into low-level implementation details
- Be a hands-on developer and build scalable, Real Time Big Data systems whilst providing a deep business understanding of CPG functions & processes
- Proven experience working with globally distributed agile teams
- Develop and maintain modern development & data management methodology
- A bachelor's degree in computer science, math, physics, engineering or a related field
- 2+ years of progressive experience in data and analytics, with at least 6 months of experience in Google Cloud Platform
- Overall, 2+ years of progressive experience working in software engineering teams
- Experience with the Google Cloud Platform technology stack for Data Analytics and Big Data technologies (BigQuery - DWH concepts, ANSI SQL; Bigtable - HBase; Dataproc - Spark/Hadoop; Dataflow - Apache Beam/Scala/Python) is required
- Experience in developing data pipelines, ETL processes, data platforms, data modelling, and data governance processes in delivering end-to-end solutions
- Solid foundation in design, data structures and algorithms, and strong analytical and debugging skills with customer-facing products experience
- Good understanding of private and public cloud design considerations and limitations in the areas of virtualization, global infrastructure, distributed systems, load balancing, networking, massive data storage, Hadoop, MapReduce, and security
- Strong programming experience in Python and Scala
- Experience using development tools like Eclipse and IntelliJ
- Experience using Jenkins, GitHub flow and artifact repositories
- Experience developing Real Time data streaming pipelines
- Drive innovation by assessing, piloting, building DevOps/Cloud tooling and services to improve overall developer experience and productivity