Spark Architect - PySpark

Posted on Jun 17, 2024 by Infoplus Technologies UK Ltd
Leeds, Yorkshire, United Kingdom
IT
Immediate Start
Annual Salary
Contract/Project

Mandatory Skills

You need to have the following skills:

  • At least 12 years of IT experience, with a deep understanding of the components around Spark data integration (PySpark, scripting, variable setting, etc.), Spark SQL, and Spark explain plans.
  • Spark SME: able to analyse Spark code failures through Spark plans and make corrective recommendations.
  • Able to walk through and explain the architectures you have been part of, and why each tool/technology was chosen.
  • Spark SME: able to review PySpark and Spark SQL jobs and recommend performance improvements (see the illustrative sketch after this list).
  • Spark SME: able to work with DataFrames/Resilient Distributed Datasets (RDDs), diagnose memory-related problems, and make corrective recommendations.
  • Monitoring: able to monitor Spark jobs with wider tools such as Grafana to identify cluster-level failures.
  • Cloudera (CDP) Spark, and how its runtime libraries are used by PySpark code.
  • Prophecy: a high-level understanding of the low-code/no-code Prophecy setup and its use to generate PySpark code.
  • Willing to work at least three days per week from the HSBC Sheffield (UK) office and to accept changes in line with customer/Wipro policies.
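
The Spark SME bullets above mention reviewing PySpark/Spark SQL jobs and their query plans. For context only, the snippet below is a minimal, hypothetical sketch of that kind of plan review; the "transactions" table, the column names, and the shuffle-partition value are illustrative assumptions, not details from the advert.

    # Minimal, hypothetical sketch of a Spark plan review (not from the advert).
    # The "transactions" table, columns, and tuning value are illustrative only.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (
        SparkSession.builder
        .appName("plan-review-sketch")
        .config("spark.sql.shuffle.partitions", "200")  # example tuning knob
        .getOrCreate()
    )

    # Hypothetical aggregation whose plan a reviewer might inspect.
    df = (
        spark.table("transactions")  # assumed table, for illustration only
        .groupBy("account_id")
        .agg(F.sum("amount").alias("total_amount"))
    )

    # explain() prints the query plans; "formatted" mode makes shuffles and
    # scans easier to spot when diagnosing failures or performance problems.
    df.explain(mode="formatted")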

Your responsibilities

As a Spark Architect working with the client's GDT (Global Data Technology) team, you will be responsible for:

  • Working on enterprise-scale cloud infrastructure and cloud services on GCP.
  • Driving the data integration upgrade to PySpark.

Reference: 2777473455

https://jobs.careeraddict.com/post/92077594
