Senior AWS Data Engineer- REMOTE
Posted on Sep 28, 2021 by NextLink Solutions
We have an interesting requirement from one of our clients based in Switzerland (fully remote). Please find the job details below; we would be glad to hear from you if you are interested in proceeding.
NextLink Solutions is currently looking for a Senior AWS Data Engineer (remote). You will join a multicultural team of passionate experts who will support you and integrate you fully as a member of the department.
Your development knowledge will be valued and challenged across various projects, and your expertise and strategic thinking will be key to the team's success.
Start Date: November/December
Duration: 4 months, with possible extension
Work location: Kaiseraugst
Remote/Home Office: Yes; full remote is acceptable within the CET time zone
Note: EU nationals, EU work permit holders, and Swiss residents are eligible to apply
Tasks & responsibilities:
The role will be affiliated with the PD Quality 'Sherlock' project, which will leverage the cloud-based PD Advanced Analytics (PDAA) platform to generate actionable insights leading to a data-driven auditing process.
Work within an Agile Scrum team to perform specific development and configuration tasks according to Jira-based User Stories/Functional Specifications
Collaborate with the Tech Lead and Business Analyst to define Functional Specifications, and implement (code or configure) solutions based on them
Actively participate in all Agile Ceremonies (Sprint Planning, Story Estimations, Retrospectives, Daily Scrums, etc.)
Translate an architecture design into a functional solution
Document Technical & Design Specifications
Main Skills :
Min. 5+ years of experience in Cloud (Platform) Data Engineering (*)
Certified and proficient in Agile ways of working with cross-functional teams
AWS Developer Certification with extensive hands-on experience with core AWS services such as Glue/Glue Studio, S3, Lambda, SageMaker, DynamoDB, Athena, Redshift, RDS, Step Functions, CloudWatch, EC2, SNS, SQS, etc. (*)
Experience in building AWS based data ingestion, processing and analytics pipelines (*)
Substantial knowledge of 'infrastructure as code', in particular Terraform and the AWS CLI (*)
Experience with CI/CD systems (preferably GitLab and Jenkins), including configuration management and automation tools (preferably Ansible and Puppet) (*)
Hands-on experience with Python coding and Bash shell scripting (*)
Nice to Haves:
Prior experience with the client or in the life sciences/pharma industry
Knowledge of Service Orchestration using Kubernetes and/or EKS
Knowledge of various databases, both SQL, and NoSQL
AWS CloudFormation knowledge
Experience with Snowflake
Knowledge of more specialized languages such as R
Experience troubleshooting performance bottlenecks in both Linux and cloud-native environments
Knowledge of IP networking, VPNs, DNS, load balancing, and firewalls
Experience collaborating on a shared code base with Git, including Git branching strategies
Experience liaising with Data Scientists and DevOps teams across the technology stacks typical for their roles