Cloud Data Architect (GCP)

Future Processing
Full-time · zł165-245/hour (PLN)

📍 Job Overview

  • Job Title: Cloud Data Architect (GCP)
  • Company: Future Processing
  • Location: Poland (remote work)
  • Job Type: B2B Contractor
  • Category: Data Architecture & Engineering
  • Date Posted: 2025-02-03
  • Experience Level: Mid-Senior level (5-10 years)
  • Remote Status: Remote OK

🚀 Role Summary

  • Lead data architecture and engineering projects using Google Cloud Platform (GCP) services
  • Collaborate with clients to understand their data needs and propose optimal GCP solutions
  • Design, implement, and manage scalable data pipelines and platforms using GCP services
  • Mentor junior team members and contribute to the growth of the Data Solutions line of business

📝 Enhancement Note: This role requires a strong background in GCP data services and a proven track record of delivering commercial projects. The ideal candidate will have a strategic mindset and excellent communication skills to work effectively with clients and team members.

💻 Primary Responsibilities

  • Client Consultation & Solution Design: Collaborate with clients to understand their data needs and propose optimal GCP solutions, including data architecture, schema design, and cost estimation.
  • Data Pipeline & Platform Development: Design, implement, and manage scalable data pipelines and platforms using GCP services such as BigQuery, Dataflow, Dataproc, and Cloud Composer (a minimal pipeline sketch follows this list).
  • Data Governance & Security: Ensure data governance best practices are followed, including data cataloging, metadata management, and data security measures using GCP services like VPC Service Controls, CMEK, and DLP.
  • Mentoring & Knowledge Sharing: Mentor junior team members, contribute to their professional development, and share knowledge through presentations and workshops.
  • Project Management & Stakeholder Communication: Coordinate project timelines, resources, and deliverables, and maintain open communication with clients and internal stakeholders.
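
For illustration, here is a minimal sketch of the kind of pipeline this responsibility describes: a streaming Dataflow job, written with the Apache Beam Python SDK, that reads JSON events from Pub/Sub and appends them to BigQuery. The project, topic, and table names are hypothetical, and a production pipeline would add schema handling, error handling, and dead-lettering.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    # Decode a JSON event published to Pub/Sub.
    return json.loads(message.decode("utf-8"))


def run():
    # streaming=True makes this a long-running streaming job when run on Dataflow.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events")  # hypothetical topic
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",  # hypothetical table, assumed to exist
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```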

📝 Enhancement Note: The role requires a balance of technical depth and breadth, as well as strong communication and leadership skills to drive projects to successful completion.

🎓 Skills & Qualifications

Education: Bachelor's degree in Computer Science, Engineering, or a related field. A Master's degree would be an asset.

Experience: At least 5 years of experience in data architecture, data engineering, or a related role, with a focus on GCP services.

Required Skills:

  • Proven expertise in GCP data services (BigQuery, BigLake, Dataflow, Dataproc, Pub/Sub, Cloud Composer, Dataplex, Sensitive Data Protection, Looker, Vertex AI)
  • Strong SQL and Python skills (see the query sketch after this list)
  • Experience with CI/CD (e.g., Cloud Build, Cloud Deploy), IaC (Terraform), and Git
  • Knowledge of data modeling, data warehousing, and data lake concepts
  • Experience with Lambda and Kappa architectures, and implementing near-real-time solutions
  • Familiarity with monitoring, diagnostics, and troubleshooting in GCP
  • Experience with data migration, including lift-and-shift, re-platform, and re-architect approaches
  • Knowledge of data security best practices and GCP security services
  • Proven ability to understand and translate business requirements into technical solutions
  • Excellent communication and presentation skills in English (C1 level)
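
As a hedged illustration of the SQL-plus-Python pairing listed above, the sketch below runs a parameterized BigQuery query with the google-cloud-bigquery client library. The project, dataset, and table names are hypothetical.

```python
from google.cloud import bigquery

# Credentials are picked up from the environment (Application Default Credentials).
client = bigquery.Client()

# Hypothetical table; query parameters avoid string interpolation in SQL.
query = """
    SELECT user_id, COUNT(*) AS events
    FROM `my-project.analytics.events`
    WHERE DATE(event_timestamp) = @day
    GROUP BY user_id
    ORDER BY events DESC
    LIMIT 10
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[bigquery.ScalarQueryParameter("day", "DATE", "2025-02-03")]
)

for row in client.query(query, job_config=job_config).result():
    print(row.user_id, row.events)
```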

Preferred Skills:

  • Experience with Apache Beam and Dataproc for batch and streaming processing
  • Familiarity with FinOps, GCP Pricing Calculator, and BigQuery Reservations
  • Knowledge of data governance frameworks and best practices
  • Experience with cloud migration tools and processes
  • Familiarity with GCP partner ecosystem and integration capabilities

📝 Enhancement Note: The ideal candidate will have a strong foundation in GCP data services and a proven track record of delivering commercial projects. Relevant certifications, such as Google Cloud Certified - Professional Data Engineer, would be an asset.

📊 Portfolio & Project Requirements

Portfolio Essentials:

  • A well-structured portfolio showcasing GCP data projects, including data pipeline designs, architecture diagrams, and performance metrics
  • Live demos or case studies demonstrating the implementation of Lambda and Kappa architectures, near-real-time solutions, and data governance best practices
  • Examples of successful client presentations and workshops, highlighting communication and leadership skills

Technical Documentation:

  • Well-documented code and scripts, with clear comments and version control using Git
  • Detailed project documentation, including data modeling, schema design, and data flow diagrams
  • Performance testing and optimization reports, demonstrating an understanding of data pipeline efficiency and cost optimization

📝 Enhancement Note: A strong portfolio will demonstrate the candidate's ability to design, implement, and manage scalable data pipelines and platforms using GCP services, as well as their communication and leadership skills.

💵 Compensation & Benefits

Salary Range: 165-245 PLN per hour (net + VAT, B2B contract). This range is in line with market rates for mid-senior level data architecture and engineering roles in Poland with GCP expertise.

Benefits:

  • Remote work opportunities
  • Competitive salary and benefits package
  • Opportunities for professional growth and development within the Data Solutions line of business
  • A dynamic and collaborative work environment with a strong focus on knowledge sharing and continuous learning

Working Hours: The standard workweek is 40 hours, with flexible hours and remote work arrangements available. The role may require occasional travel for client meetings and workshops.

📝 Enhancement Note: The rate is competitive with market standards for GCP-focused data roles at this level in Poland. Benefits and working hours are subject to the terms of the B2B contract.

🎯 Team & Company Context

🏢 Company Culture

Industry: IT and software services. Future Processing specializes in software development, data engineering, and cloud services, with a strong focus on emerging technologies and innovative solutions.

Company Size: Future Processing is a medium-sized company with a team of over 500 employees. This size allows for a collaborative and agile work environment, with opportunities for professional growth and development.

Founded: Future Processing was founded in 2000, with a strong track record of delivering successful projects for clients in various industries.

Team Structure:

  • The Data Solutions team is responsible for data engineering, data architecture, and data analytics projects using GCP services.
  • The team consists of data engineers, data architects, data analysts, and project managers, working collaboratively to deliver client projects.
  • The team reports to the Data Solutions Director, who is responsible for the strategic direction and growth of the line of business.

Development Methodology:

  • The team follows Agile methodologies, including Scrum and Kanban, to manage project timelines, resources, and deliverables.
  • Code reviews, testing, and quality assurance practices are integral to the development process.
  • CI/CD pipelines and automated deployment strategies are used to ensure efficient and reliable software delivery.

Company Website: Future Processing

📝 Enhancement Note: Future Processing is a technology company with a strong focus on emerging technologies and innovative solutions. The company's culture values collaboration, knowledge sharing, and continuous learning, providing an ideal environment for data architecture and engineering professionals to grow and develop their careers.

📈 Career & Growth Analysis

Career Level: This role is at the mid-senior level, requiring a strong foundation in GCP data services and a proven track record of delivering commercial projects. The ideal candidate will have 5-10 years of experience in data architecture, data engineering, or a related role, with a focus on GCP services.

Reporting Structure: The role reports to the Data Solutions Director, who is responsible for the strategic direction and growth of the line of business. The role may also involve mentoring and guiding junior team members, contributing to their professional development.

Technical Impact: The role has a significant impact on the design, implementation, and management of data pipelines and platforms using GCP services. The ideal candidate will be able to drive projects to successful completion, ensuring optimal performance, scalability, and cost efficiency.

Growth Opportunities:

  • Technical Specialization: Deepen expertise in specific GCP data services or emerging technologies, such as Apache Beam, Dataproc, or BigQuery Reservations.
  • Technical Leadership: Develop leadership skills through mentoring, knowledge sharing, and project management, with the potential to take on more senior roles within the Data Solutions line of business.
  • Business Development: Contribute to the growth of the Data Solutions line of business by participating in client presentations, workshops, and proposals, and by identifying new opportunities for GCP data services.

📝 Enhancement Note: This role offers significant opportunities for professional growth and development within the Data Solutions line of business. The ideal candidate will be proactive, strategic, and focused on driving projects to successful completion, while also contributing to the growth of the team and the company.

🌐 Work Environment

Office Type: Future Processing has offices in Poland, but the role can be performed remotely, with occasional travel for client meetings and workshops.

Office Location(s): The company is headquartered in Gliwice, Poland. Remote work is an option for this role, with occasional travel required for client meetings and workshops.

Workspace Context:

  • Remote Work: The role can be performed remotely, with a focus on collaboration and communication using tools such as Google Workspace, Slack, and Jira.
  • Collaboration: The role involves close collaboration with clients and internal stakeholders, including data engineers, data analysts, and project managers.
  • Development Tools: The team uses a range of development tools, including BigQuery, Dataflow, Dataproc, Cloud Composer, and Apache Beam. Familiarity with these tools is essential for the role.

Work Schedule: Standard 40-hour workweek, with flexible hours and remote arrangements available; occasional travel may be required for client meetings and workshops.

📝 Enhancement Note: The work environment for this role is flexible and collaborative, with a strong focus on knowledge sharing and continuous learning.

📄 Application & Technical Interview Process

Interview Process:

  1. Phone or Video Screen: A brief conversation to assess communication skills, cultural fit, and initial technical competencies.
  2. Technical Assessment: A hands-on assessment of GCP data services, including data modeling, pipeline design, and performance optimization.
  3. Client Presentation: A presentation to a panel of Future Processing team members and clients, demonstrating the ability to understand and translate business requirements into technical solutions.
  4. Final Interview: A discussion of the candidate's career goals, expectations, and fit within the Data Solutions team.

Portfolio Review Tips:

  • Highlight GCP data projects that demonstrate a strong understanding of data architecture, data engineering, and data governance best practices.
  • Include live demos or case studies that showcase the implementation of Lambda and Kappa architectures, near-real-time solutions, and data governance best practices.
  • Emphasize communication and leadership skills through client presentations and workshops.

Technical Challenge Preparation:

  • Brush up on GCP data services, including BigQuery, Dataflow, Dataproc, and Cloud Composer.
  • Review data modeling, data warehousing, and data lake concepts, as well as Lambda and Kappa architectures.
  • Familiarize yourself with monitoring, diagnostics, and troubleshooting in GCP, as well as data security best practices.

ATS Keywords: [See the ATS Keywords section below]

📝 Enhancement Note: The interview process for this role is designed to assess the candidate's technical competencies, communication skills, and cultural fit within the Data Solutions team. The ideal candidate will have a strong foundation in GCP data services and a proven track record of delivering commercial projects.

🛠 Technology Stack & Infrastructure

GCP Data Services:

  • BigQuery: A fully-managed, serverless data warehouse for analytics and machine learning. BigQuery enables scalable analysis over petabytes of data using standard SQL.
  • Dataflow: A fully-managed service for executing Apache Beam pipelines within the GCP data ecosystem. Dataflow supports both batch and streaming data processing.
  • Dataproc: A fully-managed service for running Apache Spark, Apache Hadoop, and Apache Hive clusters in the cloud. Dataproc is well suited to batch and interactive processing of large datasets.
  • Cloud Composer: A fully-managed workflow orchestration service built on Apache Airflow. Cloud Composer enables the creation, scheduling, and monitoring of workflows in the cloud (a minimal DAG sketch follows this list).
  • Pub/Sub: A real-time messaging service that allows you to send and receive messages between independent applications. Pub/Sub is ideal for building event-driven architectures and real-time data pipelines.
  • Cloud Storage: A scalable, secure, and durable object storage service. In data platforms, Cloud Storage typically serves as the landing zone and data lake layer for raw and staged files.
  • Cloud Functions: A serverless execution environment for building and connecting cloud services and applications. Cloud Functions enables event-driven computing and microservices architectures.
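
As a hedged sketch of the Cloud Composer entry above, the following Airflow DAG schedules a daily BigQuery rollup with the Google provider's BigQueryInsertJobOperator. The DAG id, schedule, and table names are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# A minimal daily DAG; Composer runs the Airflow scheduler and workers for you.
with DAG(
    dag_id="daily_sales_rollup",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    rollup_sales = BigQueryInsertJobOperator(
        task_id="rollup_sales",
        configuration={
            "query": {
                # Hypothetical source and destination tables.
                "query": """
                    CREATE OR REPLACE TABLE `my-project.reporting.daily_sales` AS
                    SELECT DATE(sale_timestamp) AS sale_date, SUM(amount) AS total
                    FROM `my-project.raw.sales`
                    GROUP BY sale_date
                """,
                "useLegacySql": False,
            }
        },
    )
```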

Apache Beam: An open-source, unified programming model for defining both batch and streaming data-parallel processing pipelines. Apache Beam is used to define the data processing logic for Dataflow pipelines.

Terraform: An open-source infrastructure as code (IaC) software tool for creating, changing, and versioning infrastructure safely and efficiently. Terraform is used to manage the infrastructure required to run GCP data services.

📝 Enhancement Note: The technology stack for this role is focused on GCP data services, with an emphasis on BigQuery, Dataflow, Dataproc, and Cloud Composer. Familiarity with these services is essential for the role, as well as an understanding of Apache Beam, Terraform, and other relevant tools.

👥 Team Culture & Values

Data Solutions Values:

  • Client Focus: A strong commitment to understanding and meeting the data needs of our clients, with a focus on delivering optimal solutions using GCP data services.
  • Technical Excellence: A dedication to staying up-to-date with the latest GCP data services and best practices, with a focus on continuous learning and improvement.
  • Collaboration: A commitment to working closely with clients and internal stakeholders, with a focus on knowledge sharing and teamwork.
  • Innovation: A willingness to explore new GCP data services and emerging technologies, and to apply them where they add real value to client solutions.

Collaboration Style:

  • Cross-Functional Integration: The Data Solutions team works closely with other teams within Future Processing, including software development, quality assurance, and project management, to deliver successful client projects.
  • Code Review Culture: The team follows best practices for code reviews, testing, and quality assurance to ensure the delivery of high-quality GCP data services.
  • Knowledge Sharing: The team encourages knowledge sharing and continuous learning, with a focus on mentoring and guiding junior team members.

📝 Enhancement Note: The Data Solutions team at Future Processing values collaboration, knowledge sharing, and continuous learning, providing an ideal environment for data architecture and engineering professionals to grow and develop their careers.

⚡ Challenges & Growth Opportunities

Technical Challenges:

  • Scalability: Designing and implementing scalable data pipelines and platforms using GCP data services, with a focus on performance, cost efficiency, and fault tolerance.
  • Data Governance: Ensuring data governance best practices are followed, including data cataloging, metadata management, and data security measures using GCP services (a short Sensitive Data Protection sketch follows this list).
  • Emerging Technologies: Staying up-to-date with the latest GCP data services and emerging technologies, such as Apache Beam, Dataproc, or BigQuery Reservations.
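
To make the governance challenge concrete, here is a hedged sketch of a PII inspection call using the google-cloud-dlp client for Sensitive Data Protection. The project ID and sample text are hypothetical; real deployments would typically run inspection jobs over BigQuery or Cloud Storage rather than inline strings.

```python
from google.cloud import dlp_v2

client = dlp_v2.DlpServiceClient()
parent = "projects/my-project"  # hypothetical project ID

response = client.inspect_content(
    request={
        "parent": parent,
        "inspect_config": {
            # Built-in detectors for common PII types.
            "info_types": [{"name": "EMAIL_ADDRESS"}, {"name": "PHONE_NUMBER"}],
            "min_likelihood": dlp_v2.Likelihood.POSSIBLE,
        },
        "item": {"value": "Contact jan.kowalski@example.com or +48 123 456 789."},
    }
)

for finding in response.result.findings:
    print(finding.info_type.name, finding.likelihood)
```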

Learning & Development Opportunities:

  • Technical Training: Participating in technical training and certification programs, such as Google Cloud Certified - Professional Data Engineer, to deepen expertise in GCP data services.
  • Conferences & Events: Attending industry conferences and events, such as Google Cloud Next, to stay up-to-date with the latest GCP data services and best practices.
  • Mentoring & Knowledge Sharing: Mentoring junior team members and contributing to their professional development, as well as learning from other data architecture and engineering professionals within the team.

📝 Enhancement Note: The technical challenges and learning opportunities for this role center on GCP data services, with an emphasis on scalability, data governance, and emerging technologies.

💡 Interview Preparation

Technical Questions:

  • GCP Data Services: Questions related to the design, implementation, and management of GCP data services, including BigQuery, Dataflow, Dataproc, and Cloud Composer.
  • Data Modeling & Architecture: Questions related to data modeling, schema design, and data architecture, with a focus on GCP data services.
  • Data Governance & Security: Questions related to data governance best practices, data security measures, and GCP security services.
  • Monitoring & Troubleshooting: Questions related to monitoring, diagnostics, and troubleshooting in GCP, with a focus on data pipeline performance and cost optimization.

Company & Culture Questions:

  • Client Focus: Questions related to understanding and meeting the data needs of clients, with a focus on delivering optimal solutions using GCP data services.
  • Collaboration: Questions related to working closely with clients and internal stakeholders, with a focus on knowledge sharing and teamwork.
  • Innovation: Questions related to exploring new GCP data services and emerging technologies, with a focus on driving projects to successful completion.

Portfolio Presentation Strategy:

  • GCP Data Projects: Highlight GCP data projects that demonstrate a strong understanding of data architecture, data engineering, and data governance best practices.
  • Live Demos & Case Studies: Include live demos or case studies that showcase the implementation of Lambda and Kappa architectures, near-real-time solutions, and data governance best practices.
  • Communication & Leadership: Emphasize communication and leadership skills through client presentations and workshops, with a focus on understanding and translating business requirements into technical solutions.

📌 Application Steps

To apply for this Cloud Data Architect (GCP) position at Future Processing:

  1. Submit Your Application: Click the "Apply" button on the job listing and follow the instructions to submit your resume and portfolio.
  2. Prepare Your Portfolio: Tailor your portfolio to highlight GCP data projects that demonstrate a strong understanding of data architecture, data engineering, and data governance best practices. Include live demos or case studies that showcase your implementation of Lambda and Kappa architectures, near-real-time solutions, and data governance best practices.
  3. Optimize Your Resume: Highlight your relevant experience with GCP data services, data modeling, and data architecture, as well as your communication and leadership skills. Use relevant keywords to optimize your resume for ATS systems.
  4. Prepare for Technical Interviews: Brush up on your GCP data services knowledge, data modeling concepts, and data governance best practices. Practice your communication and presentation skills, with a focus on understanding and translating business requirements into technical solutions.
  5. Research Future Processing: Learn about Future Processing's industry, company size, and team structure, as well as the Data Solutions line of business. Understand how the role fits within the organization and how you can contribute to its growth and success.

⚠️ Important Notice: This enhanced job description includes AI-generated insights and industry-standard assumptions. All details should be verified directly with the hiring organization before making application decisions.

📌 ATS Keywords

Programming Languages:

  • SQL
  • Python

Data Processing & Orchestration:

  • Apache Beam
  • Apache Airflow (Cloud Composer)

Cloud Platform & Services:

  • GCP (Google Cloud Platform)
  • BigQuery
  • Dataflow
  • Dataproc
  • Cloud Composer
  • Pub/Sub
  • Cloud Storage
  • Cloud Functions

Databases:

  • BigQuery
  • Cloud Spanner
  • Firestore

Tools:

  • Terraform
  • Git
  • CI/CD (Continuous Integration/Continuous Deployment)
  • IaC (Infrastructure as Code)
  • Monitoring & Logging (Cloud Monitoring, Cloud Logging, Error Reporting, Cloud Trace)
  • FinOps (Cloud Cost Management)
  • GCP Pricing Calculator
  • BigQuery Reservations

Methodologies:

  • Agile (Scrum, Kanban)
  • CI/CD (Continuous Integration/Continuous Deployment)
  • IaC (Infrastructure as Code)
  • DevOps
  • Data Governance
  • Data Migration (Lift-and-Shift, Re-Platform, Re-Architect)
  • FinOps (Cloud Cost Management)

Soft Skills:

  • Communication (Written, Verbal, Presentation)
  • Leadership (Mentoring, Knowledge Sharing, Project Management)
  • Teamwork (Collaboration, Knowledge Sharing, Cross-Functional Integration)
  • Problem-Solving (Troubleshooting, Root Cause Analysis, Performance Optimization)
  • Adaptability (Learning, Innovation, Emerging Technologies)
  • Strategic Thinking (Business Requirements, Technical Solutions, Cost Optimization)

Industry Terms:

  • Data Architecture
  • Data Engineering
  • Data Governance
  • Data Migration
  • Data Warehousing
  • Data Lake
  • Data Lakehouse
  • Big Data
  • Lambda Architecture
  • Kappa Architecture
  • Near-Real-Time
  • Event-Driven Architecture
  • Microservices Architecture
  • Serverless Architecture
  • Cloud-Native Architecture
  • Data Mesh
  • Data Fabric
  • Data Governance Policy as Code (DGPaC)
  • Data Lineage
  • Data Observability
  • Data Quality
  • Data Provenance
  • Data Catalog
  • Metadata Management
  • Data Security
  • Data Privacy
  • Data Protection
  • Data Encryption
  • Data Masking
  • Data Anonymization
  • Data Tokenization
  • Data Classification
  • Data Labeling
  • Data Preprocessing
  • Data Transformation
  • Data Integration
  • Data Marts
  • Data Vaults
  • Data Swamps
  • Data Silos
  • Data Management
  • Data Operations
  • Data Quality Management
  • Data Compliance Management

Application Requirements

Candidates should have at least 5 years of experience with Google Cloud Platform and a strong understanding of data services. Proficiency in SQL and Python, as well as experience in mentoring and presenting, is also required.