Lead Big Data Engineer
Posted on Apr 7, 2021 by US Cellular
Lead Big Data Engineer - INF001430
The Lead Big Data Engineer is the Subject Matter Expert (SME) for Hadoop and Kafka administration, responsible for establishing, installing, maintaining, monitoring, and troubleshooting these technologies and for training others in them. Ideally, this person will also have the desire to learn and become proficient in AWS, EDB Postgres, Citus, and MongoDB within our cross-functional team. This role focuses on driving standards and automation to enable developers to become self-sufficient where possible. It also drives quality-improvement practices to ensure our technology performs at an optimal level, remains flexible for business needs, and can scale for growth. Lastly, the Lead Big Data Engineer facilitates cross-training for this team as well as for others who wish to learn more about how to use these technologies. Of note, this role and team operate in an administration/engineering capacity with these technologies rather than a development capacity.
- Drive strategy to integrate data and interface with the vendor's operations team to manage the Customer Experience Platform data lake.
- Analyze, identify, and determine the existing processes for capturing and updating data elements.
- Develop logical and physical data models for Big Data platforms.
- Provide guidance and set standards for teams to write data pipelines and build data repositories using current technologies.
- Monitor and optimize performance of Big Data platforms.
- Research/sustain competency relevant to current technology to maintain and/or improve functionality for Engineering organization's Big Data applications.
- Represent both IS and Engineering teams in architecture and solution discussions related to Big Data systems.
- Plan and coordinate ongoing maintenance and enhancements to existing Big Data systems.
- Bachelor's degree in Engineering, Computer Science, Information Technology or related discipline, or equivalent work or military experience.
- Minimum 8 years of experience in data-, categorization-, and policy-related business positions.
- 3 years' experience with Big Data-related principles, practices, and procedures.
- 3 years of data management experience (SQL, NoSQL, ETL, and metadata management).
- Experience working with application developers to understand their needs in order to help them to access Big Data platforms.
- Experience with Agile methodologies including Scrum.
- Experience with managing Big Data platforms (Hadoop: HDFS, Impala, Hive, HBase, Spark, YARN, Hue, KTS, Sentry, ZooKeeper, etc.).
- Experience with partitioning within Hadoop.
- Experience with creating, managing, and supporting event-driven architecture (Kafka).
- Experience with managing EDB Postgres/Citus or MongoDB.
- Experience with managing and supporting AWS.
- Experience with RESTful APIs and services.
- Experience with Linux systems and scripting languages.
- Experience with functional programming (Scala, Python).
Job : Information Technology
Location(s) : Schaumburg, Illinois
U.S. Cellular® is an EEO employer and gives consideration to qualified applicants without regard to race/color/age/religion/sex/sexual orientation/gender identity/national origin/disability/veteran status, pregnancy or genetic information.