Data Engineer - PySpark - Airflow - Unix - Docker/Kubernetes - CI/CD - Finland

Empiric Solutions

Posted on Jul 2, 2025 by Empiric Solutions
Helsinki, Uusimaa, Finland
IT
1 Jul 2025
Daily Salary
Contract/Project

Empiric has received an exciting opportunity for a Data Engineer with strong experience in PySpark and Airflow and proficiency in Unix. Docker and Kubernetes experience is also required.

The Data Engineer will troubleshoot data pipelines and address issues in both real-time and batch processing.

The role also involves developing ETL processes and data flows using PySpark and Airflow, and maintaining and troubleshooting CI/CD pipelines using GitHub Actions and JFrog.

Hybrid: 6 to 10 days in the client's Helsinki office initially; this can be reduced over time.

Skills/Experience:

  • PySpark & Airflow
  • Unix
  • Docker & Kubernetes
  • CI/CD Tools: GitHub Actions
  • Maintenance & troubleshooting
  • Understanding & ensuring data quality, integrity, governance
  • Term: 6 to 18 Months plus extensions
  • Good day rate (B2B is fine) + Starter Bonus + Free Lunch Club experience

This is a critical position. Please respond to this advert or reach out to Woody (see below; WhatsApp is fine) for a confidential chat and more details on the rate and this terrific project.


Reference: 2974518542

https://jobs.careeraddict.com/post/104825346

