GCP Data Engineer - 6 Month Contract - Inside IR35 - Hybrid

Hamilton Barnes

Posted on May 13, 2024 by Hamilton Barnes
Cardiff, South Glamorgan, United Kingdom
IT
Immediate Start
Daily Salary
Contract/Project

GCP Data Engineer - 6 Month Contract - Inside IR35 - Hybrid

Are you ready to take your expertise in data engineering to the next level? Join us for a thrilling opportunity as a GCP Data Engineer, working on a 6-month contract (Inside IR35) with the potential for extension beyond 12 months. This role offers a chance to collaborate with a market-leading banking client, shaping the future of data management.

As a GCP Data Engineer, you'll be instrumental in designing, implementing, and optimizing data solutions on Google Cloud Platform (GCP). Your role will involve working onsite for 3 days per week at our Cardiff-based office, collaborating closely with our banking client to deliver exceptional results.

Key Responsibilities:

  • Utilize expertise in batch, microbatch, event-based, and data streaming architectures to design solutions for replicating, transforming, and processing data effectively.
  • Configure Kafka connectors hands-on for both batch processing and streaming, including source and sink connectors, and optimize the number of connectors to enhance performance and efficiency (see the connector sketch after this list).
  • Configure Kafka brokers to ensure optimal performance and reliability. Implement security measures and schema governance practices to maintain data integrity and compliance.
  • Apply strong database modelling concepts and SQL skills to design and optimize database structures. Develop and execute SQL queries to extract, transform, and load data efficiently.
  • Collaborate with cross-functional teams to gather requirements and translate them into technical solutions. Document architectural designs, configurations, and best practices for knowledge sharing and future reference.
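
For illustration, the sketch below shows how a connector might be registered with a Kafka Connect worker through its REST API using Python. It is a minimal sketch only: the worker URL, connector name, connector class, topic, and Schema Registry address are assumptions for this example, not the client's actual configuration.

    import requests

    # Assumed Kafka Connect REST endpoint and Schema Registry for this sketch.
    CONNECT_URL = "http://localhost:8083"
    SCHEMA_REGISTRY_URL = "http://localhost:8081"

    # Hypothetical sink connector definition; the connector class is a placeholder.
    connector = {
        "name": "example-sink",
        "config": {
            "connector.class": "com.example.ExampleSinkConnector",  # placeholder class
            "tasks.max": "2",                                        # parallelism per connector
            "topics": "example.topic",                               # topic(s) to drain
            "key.converter": "org.apache.kafka.connect.storage.StringConverter",
            "value.converter": "io.confluent.connect.avro.AvroConverter",
            "value.converter.schema.registry.url": SCHEMA_REGISTRY_URL,
        },
    }

    # POST /connectors creates the connector; Kafka Connect responds with 201 on success.
    response = requests.post(f"{CONNECT_URL}/connectors", json=connector, timeout=30)
    response.raise_for_status()
    print(response.json())

Balancing tasks.max and the number of connectors against topic partition counts is typically how throughput is traded off against broker and worker load.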

What You Will Ideally Bring:

  • Proficiency in Google Cloud Platform (GCP) Dataflow, including Apache Beam, for building and executing data processing pipelines (see the pipeline sketch after this list).
  • Strong coding skills in Python for developing custom data processing logic and transformations within Dataflow pipelines.
  • Experience with log-based Change Data Capture (CDC) using Confluent Kafka Connectors for low-latency data replication.
  • In-depth knowledge of Kafka broker configuration, including topics, partitions, replication, and optimization for performance and reliability.
  • Proficiency in implementing security measures such as SSL/TLS encryption, SASL authentication, and ACL-based authorization to secure Kafka clusters.
  • Hands-on experience with Confluent Control Center for monitoring and managing Kafka clusters, topics, and consumer groups.
  • Proficiency in using tools such as GitHub for version control, Confluence for documentation, and Jenkins for continuous integration and deployment (CI/CD) processes.
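
As a rough illustration of the Dataflow and Beam requirement above, the following is a minimal Python sketch of a streaming pipeline that reads messages from a Pub/Sub topic, parses them as JSON, and appends rows to a BigQuery table. The project, region, bucket, topic, table, and schema names are placeholders, not details of this engagement.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Pipeline options for running on Dataflow; use "DirectRunner" to test locally.
    options = PipelineOptions(
        runner="DataflowRunner",
        project="example-project",                 # assumed GCP project id
        region="europe-west2",                     # assumed region
        temp_location="gs://example-bucket/tmp",   # assumed staging bucket
        streaming=True,
    )

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/example-topic")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="example-project:example_dataset.example_table",
                schema="id:STRING,amount:FLOAT,event_ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )

Custom transformation logic of the kind described above would typically live in the beam.Map step (or a beam.DoFn) between the read and the write.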

Contract Details:

  • Duration: 6 months
  • Location: Cardiff (onsite 3 days per week)
  • Day Rate: Up to £550 (Inside IR35)

GCP Data Engineer - 6 Month Contract - Inside IR35 - Hybrid


Reference: 2759770529

https://jobs.careeraddict.com/post/90702427
