Azure/Big Data Engineer
Posted on Jan 21, 2020 by IDC Technologies Solutions Ltd
- The COE is a central function providing services such as consultancy, advisory, best practices, standards & guidelines, and design patterns.
- Partner with Business Delivery Verticals in designing and implementing cloud solutions that are secure, scalable, resilient, monitored, auditable and cost-optimized.
- Partner with Business Delivery Verticals, providing COE consultants with technical expertise.
- Work with ITSO Portfolio teams to understand the roadmap for current and future D&A capabilities and build a service launch calendar per that roadmap.
- Drive User Group Meetings and provide regular updates to the Business D&A teams
- Collaborate with Supplier and Shell staff to understand requirements for Data and Analytics that support Shell requirements and strategies
- Use the service wheel and drive improvements
- This role interfaces closely with other D&A natural teams such as Architects, Portfolio, Operations, Project Delivery, Governance, etc.
- Operate in a global environment
Responsibilities & Accountabilities
- Provide technical expertise within D&A team
- Within own scope, take the lead to plan, manage and deliver according to the demand pipeline
- Deliver through others within the natural teams - Portfolio, Projects, Architects, Operations, Business Analytics Teams, Compliance, IRM & Suppliers
- Innovate and actively drive Proof of Concepts in D&A space
- Take the initiative to improve adoption of D&A Services (Analytics and Visualization)
- Help define the Service Strategy and guide the transformation of the business lines into a truly Agile, DevOps organization with continuous delivery capability
- Team player - working in an environment that promotes a high level of personal commitment and accountability and a continued focus on quality and efficiency
- Help drive D&A knowledge sessions on focus areas that create business value. Key to this is translating IT conversations into Business conversations; the messages need to be delivered in business terms not technology terms
- Individual contributor
Skills & Requirements:
- Solid hands-on experience in full life-cycle Data Analytics projects
- Solid foundation in Big Data technology - DFS, Hadoop-related stacks, data processing, governance, analysis mechanisms (statistics, NLP & visual) & machine learning
- Solid knowledge of and experience with Azure Data Analytics services such as, but not limited to, Azure Data Factory, Azure Databricks, Azure Data Lake, Azure SQL Data Warehouse, Azure Analysis Services, Stream Analytics, Azure ML/AI, Cognitive Services, etc.
- Knowledge of and experience with data ingestion tools such as StreamSets, Aspera, IoT Hub/Event Hub and Kafka
- Knowledge of and experience with data visualization and related tools such as Power BI and Alteryx
- Experience in building application deployment automation using CI/CD tools and other industry standard deployment and configuration tools.
- Solid experience with D&A technologies, architectures and patterns.
- In-depth knowledge of and experience with D&A solutions, reference architectures, DevOps models, security, scripting and automation
- Demonstrated capacity to use standard modelling approaches, tools and model repositories
- Highly motivated and technically savvy
- Experience working in an Agile environment
- Substantial experience in IT roles such as implementation, operations, consultancy, business analysis and D&A architecture
- A university degree, preferably in Information Technology, Information Management, Software Engineering, Statistics, Mathematics or related field
- Holds PMP, PRINCE2, Scrum or Big Data Analytics-related certifications
- At least 8 years' experience in roles with accountabilities similar to those listed above, delivering enterprise-scale solutions
- Strong interpersonal and communication skills, both oral and written; good storytelling capabilities