GCP Data Engineer

Posted on Jul 21, 2022 by Conexus
Paris, France
IT
Immediate Start
€500 Daily
Contract/Project
Conexus are currently working with a global consultancy who require a GCP Data Engineer to join a client headquartered in Paris on an initial 6-12 month freelance contract.
This is a hybrid position based in Paris; however, the amount of travel is negotiable.
The successful candidate will have experience in a Data Engineer role and be familiar with Google Cloud Platform.
They should also be able to demonstrate experience with the following technologies:
- SQL authoring, with query and cost optimisation, primarily on BigQuery (a cost-control sketch follows this list).
- Data pipeline, data streaming and workflow management tools: Dataflow, Pub/Sub, Hadoop, Spark Streaming.
- Python as an object-oriented scripting language.
- Version control system: Git.
- Infrastructure as Code: Terraform (preferable).
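By way of illustration only, the sketch below shows the kind of BigQuery cost discipline the role calls for, using the google-cloud-bigquery Python client: a dry run to estimate bytes scanned, followed by a capped execution. The project, table, partition column and byte budget are all hypothetical placeholders, not details of the client's environment.

```python
# Minimal sketch: cost-aware BigQuery querying from Python.
# The project, table, partition column and byte budget are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")

sql = """
    SELECT user_id, COUNT(*) AS events
    FROM `my-analytics-project.events.pageviews`  -- hypothetical partitioned table
    WHERE event_date = @day                       -- prune to a single partition
    GROUP BY user_id
"""
params = [bigquery.ScalarQueryParameter("day", "DATE", "2022-07-20")]

# Dry run: estimates bytes scanned without executing (or billing) the query.
dry_cfg = bigquery.QueryJobConfig(
    dry_run=True, use_query_cache=False, query_parameters=params)
dry_job = client.query(sql, job_config=dry_cfg)
print(f"Estimated bytes scanned: {dry_job.total_bytes_processed:,}")

# Real run, capped so a runaway query cannot bill more than ~1 GB scanned.
run_cfg = bigquery.QueryJobConfig(
    maximum_bytes_billed=10**9, query_parameters=params)
for row in client.query(sql, job_config=run_cfg).result():
    print(row.user_id, row.events)
```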
Responsibilities will include:
- Create and maintain optimal data pipeline architecture: automate manual processes, optimise data delivery and re-design infrastructure for greater scalability.
- Assemble large, complex data sets that meet functional/non-functional business requirements.
- Build the infrastructure required for optimal extraction, transformation and loading of data from a wide variety of sources, using SQL and GCP big data technologies (a streaming sketch follows this list).
- Build analytics tools that utilise the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with data and analytics experts to strive for greater functionality in data systems.
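Purely as a sketch of the streaming work described above, and not the client's actual stack, here is a minimal Apache Beam pipeline that could run on Dataflow: it reads JSON events from Pub/Sub, counts events per user in one-minute windows and appends the results to BigQuery. The subscription, table and schema are hypothetical, and a production pipeline would add parsing guards, dead-lettering and DataflowRunner options.

```python
# Minimal sketch: Pub/Sub -> one-minute windowed counts per user -> BigQuery.
# The subscription, table and schema below are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam import window
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)  # add DataflowRunner options to deploy

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadPubSub" >> beam.io.ReadFromPubSub(
            subscription="projects/my-project/subscriptions/events-sub")
        | "Parse" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "KeyByUser" >> beam.Map(lambda event: (event["user_id"], 1))
        | "Window" >> beam.WindowInto(window.FixedWindows(60))
        | "CountPerUser" >> beam.CombinePerKey(sum)
        | "ToRow" >> beam.Map(lambda kv: {"user_id": kv[0], "events": kv[1]})
        | "WriteBQ" >> beam.io.WriteToBigQuery(
            "my-project:analytics.user_events_per_minute",
            schema="user_id:STRING,events:INTEGER",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```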
The ideal candidate will also have:
- Advanced SQL knowledge and query-authoring experience with relational databases, as well as working familiarity with a variety of other databases.
- Experience building and optimising data pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions.
- Experience building processes that support data transformation, data structures, metadata, dependency and workload management.
- Working knowledge of message queuing, stream processing, and highly scalable data stores.
- Experience supporting and working with cross-functional teams in a dynamic environment.
If this opportunity is of interest, please forward a CV for consideration.
Reference: 1675221133