CareerAddict

DevOps Engineer

Red - The Global SAP Solutions Provider

Posted on May 7, 2025 by Red - The Global SAP Solutions Provider
Baltimore, MD, 21201
IT
Immediate Start
Annual Salary
Full-Time

Role Purpose

We are a leader in wireless manufacturing and retail solutions, and we are seeking a highly skilled Cloud BI Developer to join our team. The ideal candidate will have strong expertise in cloud-based data platforms such as Azure or AWS, including tools like Synapse Analytics or Redshift, Databricks or EMR, Python, PySpark, and cloud-native data integration services (e.g., Azure Data Factory or AWS Glue). In this role, you will be responsible for designing, developing, and maintaining scalable data solutions while ensuring efficient ETL/ELT processes, robust data governance, and performance optimization across cloud environments.

Job Description
* Design, build, and maintain data pipelines and solutions using Azure and/or AWS cloud technologies.
* Develop and optimize scalable data solutions using tools such as Azure Synapse Analytics or Amazon Redshift, Databricks or AWS EMR, Python, PySpark, and Azure Data Factory (ADF) or AWS Glue.
* Design, develop, and maintain ETL/ELT pipelines for efficient data processing and transformation (an illustrative sketch follows this list).
* Implement Data Lake architecture using Delta Lake, Amazon S3, Azure Data Lake, or equivalent, for managing structured and unstructured data.
* Enforce and manage data security, governance, and compliance in cloud environments (e.g., Azure Purview, AWS Lake Formation, IAM roles).
* Collaborate with cross-functional teams to integrate data insights and deliver impactful solutions that support business goals.
* Implement DevOps/CI/CD processes using tools such as Azure DevOps, GitHub Actions, or AWS CodePipeline/CodeBuild to streamline development workflows.
* Leverage expertise in Synapse Analytics, Redshift, or Snowflake for large-scale data integration and analytics workloads.
* Work hands-on with Databricks (on Azure or AWS) or EMR for big data processing, analytics, and machine learning pipelines.
* Provide guidance on best practices for performance optimization, monitoring, and troubleshooting.
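
For context on the kind of pipeline work described above, the following is a minimal PySpark sketch of an ETL step that reads raw files from cloud object storage, applies basic cleansing, and writes a Delta Lake table. The bucket path, dataset, and column names are hypothetical, and writing the `delta` format assumes the Delta Lake runtime is available on the cluster (as on Databricks).

```python
# Minimal ETL sketch (hypothetical paths and columns; assumes a Spark cluster
# with the Delta Lake runtime, e.g., Databricks or Spark + delta-spark).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales-etl-sketch").getOrCreate()

# Extract: read raw CSV files from cloud object storage (S3 or ADLS path).
raw = spark.read.option("header", True).csv("s3://example-bucket/raw/sales/")

# Transform: deduplicate, type the columns, and drop invalid rows.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_date"))
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)
)

# Load: write a partitioned Delta Lake table for downstream analytics.
(
    cleaned.write
           .format("delta")
           .mode("overwrite")
           .partitionBy("order_date")
           .save("s3://example-bucket/curated/sales_delta/")
)
```

In practice, steps like these would typically be orchestrated by Azure Data Factory, AWS Glue, or a scheduled Databricks job rather than run ad hoc.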

Skills Required
* Proficiency in Python for data processing, automation, and transformation.
* Extensive experience with PySpark for distributed data processing.
* Strong knowledge of Spark SQL for querying and managing big data (see the query sketch after this list).
* Proven hands-on expertise with tools like Azure Synapse Analytics, Amazon Redshift, Databricks, Azure Data Factory (ADF), and AWS Glue for building ETL/ELT pipelines.
* Strong understanding of Data Lake architecture and Delta Lake principles, with experience on platforms such as Azure Data Lake Storage (ADLS) or Amazon S3.
* Knowledge of data security, governance, and compliance standards in cloud environments, including tools like Azure Purview, AWS Lake Formation, and IAM policies.
* Expertise in setting up CI/CD pipelines using Azure DevOps, GitHub Actions, or AWS CodePipeline/CodeBuild for workflow automation.
* 2+ years of experience implementing automated data engineering processes in either Azure or AWS environments.
* Cloud certifications (e.g., Azure Data Engineer Associate, AWS Certified Data Analytics - Specialty, or similar) are a plus.
* Ability to work onsite.
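
To illustrate the Spark SQL skill above, here is a minimal query sketch over the Delta table produced in the earlier ETL sketch. The table location, view name, and column names are hypothetical.

```python
# Minimal Spark SQL sketch (hypothetical table location and columns).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sales-analytics-sketch").getOrCreate()

# Expose the curated Delta table as a temporary view for SQL access.
(
    spark.read.format("delta")
         .load("s3://example-bucket/curated/sales_delta/")
         .createOrReplaceTempView("sales")
)

# Aggregate daily order counts and revenue for downstream BI reporting.
daily_revenue = spark.sql("""
    SELECT order_date,
           COUNT(DISTINCT order_id) AS orders,
           SUM(amount)              AS revenue
    FROM sales
    GROUP BY order_date
    ORDER BY order_date
""")

daily_revenue.show(10)
```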


Reference: 2944751773

https://jobs.careeraddict.com/post/103267068

This Job Vacancy has Expired!

