Big Data Engineer (GCP)
Posted on Jan 27, 2025 by Intuition IT Solutions Ltd
Job Title: Big Data Engineer (GCP)
Location: Birmingham (Hybrid - 2 days onsite, 3 days remote)
Duration: 6 months + extensions (contract role, inside IR35)
Start Date: Immediate
Job Description: We are seeking a highly skilled Big Data Engineer with Google Cloud Platform (GCP) experience to join our dynamic team. This role requires expertise in designing, implementing, and optimizing data pipelines and architectures on GCP. You will work closely with data scientists, analysts, and software engineers to develop scalable and efficient data solutions.
Key Responsibilities:
- Design, develop, and maintain data pipelines and ETL processes on GCP.
- Work with BigQuery, Dataflow, Pub/Sub, Cloud Storage, and other GCP services.
- Implement real-time and batch data processing solutions (see the pipeline sketch after this list).
- Optimize and scale data storage, transformation, and retrieval processes.
- Ensure data security, governance, and compliance with best practices.
- Collaborate with cross-functional teams to understand data requirements and implement solutions.
- Troubleshoot and resolve data-related issues.
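The following is an illustrative sketch only, not part of the role specification: a minimal Apache Beam (Python) streaming pipeline of the kind these responsibilities describe, reading JSON events from Pub/Sub and appending them to BigQuery on Dataflow. The project, bucket, topic, and table names are all hypothetical.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    # Pub/Sub delivers raw bytes; decode and parse a single JSON event.
    return json.loads(message.decode("utf-8"))


def run():
    options = PipelineOptions(
        streaming=True,
        runner="DataflowRunner",
        project="my-project",                # hypothetical project ID
        region="europe-west2",
        temp_location="gs://my-bucket/tmp",  # hypothetical staging bucket
    )
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events")  # hypothetical topic
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",  # hypothetical dataset.table
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```

The same code runs locally for testing by switching the runner to DirectRunner; with DataflowRunner it deploys as a managed Dataflow job.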
Key Skills & Experience:
- Strong experience in Big Data engineering with hands-on GCP expertise.
- Proficiency in BigQuery, Dataflow, Dataproc, Cloud Composer, and Pub/Sub.
- Strong SQL skills and experience with data modelling (see the BigQuery sketch after this list).
- Expertise in Python, Java, or Scala for data processing.
- Experience with Apache Spark, Hadoop, and Kafka is a plus.
- Knowledge of CI/CD pipelines and Infrastructure as Code (Terraform, Cloud Deployment Manager).
- Strong understanding of data warehousing, ETL, and data lake architectures.
- Experience in performance tuning and optimization of data processes.
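Again for illustration only: a minimal sketch of parameterised, partition-aware querying with the google-cloud-bigquery Python client, the kind of SQL and performance-tuning work listed above. The dataset, table, and column names are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

# Filtering on the table's partitioning column lets BigQuery prune
# partitions, a common first step when tuning query cost and latency.
query = """
    SELECT user_id, COUNT(*) AS events
    FROM `my-project.analytics.events`
    WHERE event_date BETWEEN @start AND @end  -- assumed partition column
    GROUP BY user_id
    ORDER BY events DESC
    LIMIT 100
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start", "DATE", "2025-01-01"),
        bigquery.ScalarQueryParameter("end", "DATE", "2025-01-27"),
    ]
)
for row in client.query(query, job_config=job_config).result():
    print(row.user_id, row.events)
```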
Reference: 2887551480