This Job Vacancy has Expired!

Solution Architect-Permanent-Hybrid

Posted on Apr 7, 2022 by McCabe & Barton

London, United Kingdom
Immediate Start
£85k - £110k Annual

Overview of the role

You will lead the design and build of strategic data integration and data analytics solutions for our sales and marketing reporting and analytics processes.

Duties and responsibilities

You will:

  • Guide the data and solutions architecture of the Global Client Analytics platform, specifically for the AUM, Flows, and Client Interactions entities. Provide system blueprints and logical data models describing the relationships between them.
  • Design, document, develop, test, install, and support complex ETL/ELT data pipelines using Azure Databricks, Informatica, AWS Glue, Azure Data Factory, or similar platforms, applying knowledge of data warehouse, data mart, and data lake architectures, concepts, and processes.
  • Provide BAU support for existing data designs in Oracle, SQL Server, Azure Databricks Delta Lake, and Snowflake.
  • Provide testing support where necessary. Participate in the design and evolution of Dev Ops for Global Client Analytics.
  • Create, or direct the creation of, the associated scheduling logic for data pipelines using a scheduling tool such as Autosys or Azure Databricks.
  • Work within the agile delivery methodology, picking up data development tasks within a 2-week sprint, providing work estimates for tasks to project management, and participating in agile ceremonies.
  • Follow existing Janus standards and procedures for data movement, and suggest new standards and procedures as the move toward cloud technologies matures.
  • Carry out additional duties as assigned.

Technical skills and qualifications

  • One year of experience building cloud solution designs specific to data architecture or data engineering on Google Cloud, Azure, or AWS.
  • Five years of experience developing, maintaining, and supporting ETL/ELT programs with various source/target types and transformation logic.
  • Three years of experience in data warehouse (with large datasets) and Business Intelligence environments, implementing enterprise data management processes, procedures, and decision support using Informatica PowerCenter v10.x, Data Quality (IDQ), and Data Profiling (IDE), or using Microsoft SQL Server SSIS, stored procedures, or similar.
  • Experience building data pipelines for BI tools and/or data lake implementations using cloud-native tools such as AWS Glue, Azure Data Factory, Google Dataflow, and/or Python.
  • Experience using event-driven architectures (e.g. pub/sub messaging) to drive data pipelines is a plus.
  • Experience with Tableau, Cognos, Qlikview, or similar BI tools is a plus.
  • Must be familiar with Agile delivery methodology and have participated in the writing of user stories and in all agile ceremonies.

Competencies required

In addition to putting clients first, acting like an owner, and succeeding as a team, the competencies for this role include:

  • Working knowledge of information technology methodologies and standards
  • Problem solving and analysis; able to understand complex issues and design pragmatic but robust solutions.
  • Self-motivation; a natural ability to take initiative and progress projects and initiatives with minimal support.
  • Strong written and verbal communication and reasoning skills.
  • Time management and planning; managing personal workload effectively, setting realistic and achievable targets, managing expectations, and delivering to those targets.

Reference: 1558542916
