This Job Vacancy has Expired!

Python Complex Trading Data Platform Engineer - Outside IR35 - £ per day

Caspian One Ltd

Posted on Jul 17, 2021 by Caspian One Ltd

London, United Kingdom
IT
Immediate Start
Daily Salary
Contract/Project


My client is searching for Data Platform Engineers to develop a next-generation trade data warehousing and analysis platform. This advanced platform provides reliable data validation, ingestion, and reconciliation across streaming and batch data pipelines, along with tools for post-hoc data investigation, analysis, and report building. The goal is to set the bar for the most reliable, robust platform for processing and warehousing trade data across the industry, focusing first on regulatory reporting use cases and expanding from there. Data Platform Engineers are responsible for building, evolving, and expanding this platform to enable teams across the company to build and deliver highly scalable, reliable, and robust data products.

The platform will ingest all of the group's trade data (orders, executions and allocations) across its many business lines, and serve as the common clearing house for building out both reports for traders and regulatory reports. In addition, it provides a uniform view of reference data needed to understand the trade data, so we can quickly and effectively work with regulatory reports going back years, making corrections as needed.

The platform performs near-real-time data processing and validation on Kafka, as well as batch processing after data is loaded into warehouses. The platform helps define the data formats and how the many systems across the company interact. We build components to allow self-service access and querying of the data, as well as common tools and code that make it easy to work with the data in standard ways.
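To illustrate the kind of validation step such a pipeline applies to each incoming trade record, here is a minimal Python sketch. All field names, rules, and function names are illustrative assumptions, not the client's actual schema.

```python
# Illustrative only: a per-record validation check of the kind a streaming
# trade-data pipeline might run on each message before ingestion.
# The schema (REQUIRED_FIELDS) and rules below are assumptions.

REQUIRED_FIELDS = {"trade_id", "instrument", "quantity", "price", "timestamp"}

def validate_trade(record: dict) -> list[str]:
    """Return a list of data-quality errors for one trade record (empty if clean)."""
    errors = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - record.keys())]
    if "quantity" in record and not isinstance(record["quantity"], int):
        errors.append("quantity must be an integer")
    if "price" in record and (
        not isinstance(record["price"], (int, float)) or record["price"] <= 0
    ):
        errors.append("price must be a positive number")
    return errors

# Example: a record missing its timestamp and carrying a negative price
bad = {"trade_id": "T1", "instrument": "VOD.L", "quantity": 100, "price": -5.0}
print(validate_trade(bad))
```

In a real deployment a check like this would sit inside the Kafka consumer loop, routing failing records to a quarantine topic and reporting error counts back to the owning team.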

Responsibilities:

  • Develop, improve, test, and deploy new data pipelines, while normalizing and standardizing our various data flows

  • Work with teams across the company to provide access to the data and tools to utilize it

  • Design and implement new ways of monitoring and reporting back to individual teams on data quality issues

  • Manage and triage data flow issues

  • Develop general tools for amending and exploring data across the platform

  • Write new platform reports that alert individual teams to their data quality issues and help them resolve them

Qualifications:

  • Python infrastructure and service development

  • Experience building and working with ETL pipelines

  • Data warehouse design and operations

  • Ability to build, deploy, operate, and debug reliable, scalable infrastructure

  • 3+ years of software development experience

  • A track record of writing trustworthy software, with tests to ensure correctness

  • BS in Computer Science or a similar technical degree

Technologies:

  • Databases (desired): Snowflake, SQL Server, Sybase IQ

  • Languages: Python (required), SQL (familiarity required), C++ (preferable)

  • Technologies (preferable): ETL systems, data warehouses, Kubernetes




Reference: 1257690855
