Data DevOps Engineer - DevOps, Big Data - Permanent - Gloucestershire

Posted on Nov 15, 2023 by Curo Services
Job Title - Data DevOps Engineer - DevOps, Big Data - Permanent - Gloucestershire
Location - Gloucestershire/Bristol
Salary - £65K - £85K per annum, negotiable DOE
Benefits - Flexible working hours, career opportunities, private medical, excellent pension, and social benefits
The Client - Curo are collaborating with a global edge-to-cloud company advancing the way people live and work. They help companies connect, protect, analyse, and act on their data and applications wherever they live, from edge to cloud, so they can turn insights into outcomes at the speed required to thrive in today's complex world.
The Candidate - We are looking for a bright, driven, customer-focussed professional to join our client's Hybrid Cloud Delivery team and work alongside Enterprise Data Engineering Consultants to accelerate and drive data engineering opportunities.
This is a fantastic opportunity for a dynamic individual with big ambitions: an established technologist with both outstanding technical ability and a consultative mindset. It would suit an open-minded, personable self-starter who relishes the fluidity and collaborative nature of consultancy.
Please Note - Candidates must be eligible for DV Clearance and willing to work fully on-site.
The Role - This role sits within our client's Advisory and Professional Services delivery team, which provides thought-leadership, industry know-how and technical excellence to consultative engagements, helping customers reap maximum business benefit from their technology investments and leveraging best-in-class Vendor & Partner technologies to create relevant, effective and business-valued technical solutions.
The Data DevOps Engineer role centres on the detailed development and implementation of scalable, clustered Big Data solutions, with a specific focus on automated dynamic scaling and self-healing systems.
Duties -
- Participating in the full life cycle of data solution development, from requirements engineering through to continuous optimisation engineering and all the typical activities in between
- Providing technical thought-leadership and advice on technologies and processes at the core of the data domain, as well as data-domain-adjacent technologies
- Engaging and collaborating with both internal and external teams, acting as a confident participant as well as a leader
- Assisting with solution improvement activities driven either by the project or the service
Essential Requirements -
- Excellent knowledge of Linux operating system administration and implementation
- Broad understanding of the containerisation domain and adjacent technologies/services, such as Docker, OpenShift, Kubernetes, etc.
- Infrastructure as Code and CI/CD paradigms and systems, such as Ansible, Terraform, Jenkins, Bamboo, Concourse, etc.
- Monitoring using products such as Prometheus, Grafana, ELK, Filebeat, etc.
- Observability and SRE practices
- Big Data solutions (ecosystems) and technologies such as: Apache Spark and the Hadoop Ecosystem
- Edge technologies, e.g. NGINX, HAProxy
- Excellent knowledge of YAML or similar configuration languages
Desirable Requirements -
- Awareness of JupyterHub
- MinIO or similar S3-compatible storage technology
- Trino/Presto
- RabbitMQ or another common message queue technology, e.g. ActiveMQ
- NiFi
- Rego
- Familiarity with code development and scripting in Python, Bash, etc.
To apply for this Data DevOps Engineer permanent job, please click the button below and submit your latest CV.
Curo Services endeavours to respond to all applications; however, this may not always be possible during periods of high volume. Thank you for your patience.
Curo Services is a trading name of Curo Resourcing Ltd and acts as an Employment Business for contract and temporary recruitment as well as an Employment Agency in relation to permanent vacancies.
Reference: 2679158639
