This Job Vacancy has Expired!

GCP Data Architect

Gazelle Global Consulting

Posted on Jun 27, 2021 by Gazelle Global Consulting

Not Specified, Netherlands
Immediate Start
£80k Annual


Hope you are doing well!

We have an opportunity for a "GCP Data Architect" with our client.

If you are open to a change, please send your profile so we can have a detailed discussion about this opportunity.

GCP Data Architect, Amsterdam, NL

Key responsibilities may include:

  • As lead architect, work with implementation teams from concept to operations, providing deep technical subject-matter expertise for successfully deploying large-scale data solutions in the enterprise, using modern data/analytics technologies on premises and in the cloud.

  • Through a formal sales process, work with the Sales team to identify and qualify Google Cloud Platform (GCP) opportunities.

  • Lead cloud solution design and scoping to generate estimates and approaches for customer proposals and SOWs.

  • Create detailed target-state technical, security, data and operational architecture and design blueprints that incorporate modern data technologies and cloud data services and demonstrate the modernization value proposition.

  • Lead detailed technical assessments of the current state of enterprise data and architect a path to transformation into a modern, data-powered enterprise.

  • Demonstrate experience presenting technical topics in front of an audience

  • Conduct full technical discovery, identifying pain points, business and technical requirements, "as is" and "to be" scenarios.

  • Compare solution alternatives across both technical and business parameters that support the defined cost and service requirements.

  • Apply Methodology, reusable assets, and previous work experience to deliver consistently high-quality work.

  • Deliver written or oral status reports regularly.

  • Stay educated on new and emerging technologies/patterns/methodologies and market offerings that may be of interest to our clients.

  • Adapt existing methods and procedures to create alternative solutions to moderately complex problems.

Requirements:

  • 10+ years of technical solutions implementation, architecture design, evaluation, and investigation in a cloud environment.

  • Minimum of 5 years architecting, developing, and deploying scalable enterprise data analytics solutions (enterprise data warehouses, data marts, etc.).

  • Knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools and environments (such as Apache Beam, Hadoop, Spark, Pig, Hive, MapReduce, Flume)

  • Minimum 1 year of designing, building and operationalizing large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with third-party tools: Spark, Cloud Dataproc, Cloud Dataflow, Apache Beam, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Functions, etc.

  • Minimum 1 year of hands-on experience analysing, re-architecting and re-platforming on-premises data warehouses to data platforms on Google Cloud using GCP or third-party services

  • Minimum 1 year of designing and building production data pipelines from ingestion to consumption within a hybrid big data architecture, using Java, Python, Scala, etc.

  • Minimum 1 year of architecting and implementing next-generation data and analytics platforms on GCP

  • Minimum 1 year of designing and implementing data engineering, ingestion and curation functions on GCP using GCP-native services or custom programming

  • Minimum 1 year of experience performing detailed assessments of current-state data platforms and creating an appropriate transition path to GCP

  • Bachelor's degree or equivalent (minimum 12 years) work experience. If Associate Degree, must have minimum 6 years work experience

  • 4+ years of experience architecting large-scale data solutions: performing architectural assessments, crafting architectural options and analysis, and finalizing the preferred solution alternative in collaboration with IT and business stakeholders

  • 2+ years of hands-on experience designing and implementing data ingestion solutions on GCP using GCP-native services or third-party tools such as Talend and Informatica

  • 2+ years of hands-on experience architecting and designing data lakes on GCP serving analytics and BI application integrations

  • 1+ years of experience designing and optimizing data models on Google Cloud using GCP data stores such as BigQuery and Bigtable

  • Architecting and implementing data governance and security for data platforms on GCP

  • Designing operations architecture and conducting performance engineering for large-scale data lakes in a production environment

  • Craft and lead client design workshops and provide tradeoffs and recommendations towards building solutions

Professional Skill Requirements:

  • Proven ability to build, manage and foster a team-oriented environment

  • Excellent communication (written and oral) and interpersonal skills

  • Excellent leadership and management skills

  • Proven ability to work creatively and analytically in a problem-solving environment.

Reference: 1238898984
