Big Data DevOps Engineer - Dublin
Posted on Aug 19, 2020 by Caspian One Ltd
I have the pleasure of working with a top-tier investment bank in Dublin. They are building out a new team and are looking for Big Data Engineers to support the migration of services to the public cloud using Azure.
The ideal candidate will have previous public cloud experience delivering enterprise data solutions within financial services including knowledge of the security and regulatory requirements. The role will initially focus on delivering Azure SQL Database followed by adoption of additional Azure data services.
If you are looking for the opportunity to be a part of a newly established team and would like to take the next step in your career, then look no further!
Roles, Responsibilities and Requirements:
Strong programming skills, with experience in API and webhook development using Python, Ruby, PowerShell and shell scripting
Experience with automating (provisioning, configuration management, deployment...) and integrating Azure Data PaaS solutions (Azure SQL, Azure Cosmos DB, Azure SQL Data Warehouse, Azure Data Lake, Azure Databricks, Azure HDInsight)
Design and build modern data pipelines/streams and data service APIs to assist with data migration (on-premises SQL to Azure SQL databases, on-premises Hadoop to Azure Data Lake)
Write and use Azure Resource Manager (ARM) templates
Understand product features for DR/BCP options and how they fit into the overall application architecture
Experience deploying/configuring Azure data services: Databricks, ADLS, Blob Storage, Data Factory, Data Lake, SQL Data Warehouse...
3+ years of experience developing platform orchestration code in Python and Groovy
3+ years of experience writing automation pipelines in Jenkins
Demonstrated knowledge of cloud provisioning and administration