Senior Cloud Data Engineer (GCP)
📍 Job Overview
- Job Title: Senior Cloud Data Engineer (GCP)
- Company: Future Processing
- Location: Gliwice, Śląskie, Poland
- Job Type: On-site
- Category: Data Engineering
- Date Posted: 2025-06-25
- Experience Level: 5-10 years
- Remote Status: On-site
🚀 Role Summary
- Lead the design, implementation, and maintenance of cloud data pipelines and data lakes using Google Cloud Platform (GCP)
- Collaborate with cross-functional teams to deliver data-driven solutions and optimize data processing workflows
- Ensure data quality, security, and compliance in accordance with industry standards and best practices
- Mentor junior team members and contribute to the continuous improvement of data engineering processes
📝 Enhancement Note: This role requires a senior-level data engineer with extensive experience in GCP to drive data projects end-to-end and ensure high-quality data solutions.
💻 Primary Responsibilities
- Data Pipeline Development: Design, implement, and maintain efficient, reusable, and reliable ETL/ELT pipelines using GCP services such as BigQuery, Dataflow, Dataproc, and Cloud Composer (see the illustrative sketch below)
- Data Warehouse & Lake Management: Build, maintain, and optimize data warehouses and lakes in BigQuery, ensuring data integrity, accessibility, and performance
- Data Governance: Establish and enforce data governance policies, including data cataloging, metadata management, and access control
- Collaboration & Communication: Work closely with stakeholders, data analysts, and data scientists to understand data requirements, provide technical guidance, and ensure data-driven decision-making
- Problem Solving: Troubleshoot and resolve data-related issues, optimize data processing workflows, and improve data quality
📝 Enhancement Note: This role demands strong problem-solving skills and the ability to work independently and collaboratively to deliver high-quality data solutions.
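📝 Enhancement Note: To give a concrete flavor of the pipeline work described above, here is a minimal, hypothetical sketch of a batch ETL job built with the Apache Beam Python SDK (the programming model behind Dataflow). All project, bucket, table, and field names are illustrative assumptions, not Future Processing specifics.
```python
# Hypothetical batch ETL sketch: read raw CSV events from Cloud Storage,
# clean them, and append them to a BigQuery table. All resource names
# below are placeholders for illustration only.
import csv

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(line: str) -> dict:
    """Turn one raw CSV line into a typed row for BigQuery."""
    user_id, event_type, amount = next(csv.reader([line]))
    return {"user_id": user_id, "event_type": event_type, "amount": float(amount)}


def run() -> None:
    options = PipelineOptions(
        runner="DataflowRunner",        # use "DirectRunner" to test locally
        project="example-project",      # placeholder project ID
        region="europe-west1",
        temp_location="gs://example-bucket/tmp",
    )
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "Read raw CSV" >> beam.io.ReadFromText(
                "gs://example-bucket/raw/events-*.csv", skip_header_lines=1)
            | "Parse rows" >> beam.Map(parse_event)
            | "Drop refunds" >> beam.Filter(lambda row: row["amount"] >= 0)
            | "Load BigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="user_id:STRING,event_type:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```
Switching the runner to "DirectRunner" executes the same pipeline locally, a common way to verify transformation logic before deploying to Dataflow.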
🎓 Skills & Qualifications
Education: Bachelor's degree in Computer Science, Information Technology, or a related field. Relevant certifications, such as Google Cloud Certified - Professional Data Engineer, are a plus.
Experience: 5+ years of experience in data engineering, with at least 3.5 years focused on GCP. Proven experience in building and maintaining data lakes, data warehouses, and data pipelines.
Required Skills:
- Proficient in SQL and Python for data manipulation and analysis
- Strong knowledge of GCP services, including BigQuery, Dataflow, Dataproc, and Cloud Composer
- Experience with Git and CI/CD pipelines
- Familiarity with data governance principles and best practices
- Excellent communication and collaboration skills
- Working proficiency in English (minimum B2 level)
Preferred Skills:
- Experience with BigLake, Lakehouse, or Data Mesh architectures
- Knowledge of SMP (symmetric multiprocessing) and MPP (massively parallel processing) architectures
- Familiarity with data migration, data protection (IAM, DLP, GDPR), and FinOps principles
- Experience with data visualization tools, such as Tableau or Power BI
📝 Enhancement Note: Candidates with experience in data migration, data protection, and FinOps will have a competitive advantage in this role.
📊 Portfolio & Project Requirements
Portfolio Essentials:
- Include examples of data pipelines, data warehouses, and data lakes you've designed and implemented using GCP
- Highlight your ability to optimize data processing workflows and improve data quality
- Showcase your problem-solving skills by demonstrating how you've addressed data-related challenges in previous projects
Technical Documentation:
- Provide clear and concise documentation for your data projects, including data sources, transformation logic, and target schemas
- Include any relevant scripts, code snippets, or configuration files that demonstrate your technical proficiency
- Ensure your documentation follows best practices and is easy to understand for both technical and non-technical stakeholders
📝 Enhancement Note: Well-structured and well-documented projects will help you stand out and demonstrate your attention to detail.
💵 Compensation & Benefits
Salary Range: The salary range for this role is 135-200 PLN per hour (net + VAT, B2B contract). The exact rate will depend on the candidate's experience, skills, and qualifications.
Benefits:
- Competitive salary and benefits package
- Opportunity to work with cutting-edge technologies and collaborate with talented team members
- Potential for career growth and professional development within the organization
- Flexible working hours and remote work options may be available, depending on the project and team requirements
Working Hours: The standard working hours for this role are 40 hours per week, with flexibility for project deadlines and maintenance windows.
📝 Enhancement Note: The salary range provided is based on market research and industry standards for senior-level data engineering roles in Poland. The exact salary will be determined based on the candidate's qualifications and experience.
🎯 Team & Company Context
🏢 Company Culture
Industry: Future Processing operates in the IT industry, specializing in data solutions and custom software development. This role will involve working with clients from various industries, including finance, healthcare, and retail.
Company Size: Future Processing is a mid-sized company with a team of over 250 professionals. This size allows for a collaborative and agile work environment, with opportunities for growth and career development.
Founded: Future Processing was founded in 2007 and has since grown to become a trusted partner for businesses seeking innovative IT solutions.
Team Structure:
- The data solutions team consists of data engineers, data analysts, data scientists, and data architects, working collaboratively to deliver data-driven projects
- The team follows an Agile/Scrum methodology, with regular sprint planning, daily stand-ups, and retrospectives
- Cross-functional collaboration is encouraged, with close integration between data teams, software development teams, and project management teams
Development Methodology:
- The team follows best practices for data engineering, including data modeling, data warehousing, ETL/ELT processes, and data governance
- Data pipelines are designed using a modular and reusable approach, with a focus on performance, scalability, and maintainability
- The team uses version control, code reviews, and automated testing to ensure code quality and consistency
Company Website: Future Processing
📝 Enhancement Note: Future Processing's culture emphasizes collaboration, innovation, and continuous learning. Candidates who thrive in dynamic, team-oriented environments will excel in this role.
📈 Career & Growth Analysis
Career Level: This role is suited for a senior-level data engineer with extensive experience in GCP and a strong background in data engineering. The ideal candidate will have a proven track record of delivering high-quality data solutions and mentoring junior team members.
Reporting Structure: The senior cloud data engineer will report directly to the data solutions team lead and work closely with other data engineers, data analysts, and data scientists. They will also collaborate with project managers and stakeholders to ensure data projects align with business objectives.
Technical Impact: The senior cloud data engineer will play a critical role in designing, implementing, and maintaining data pipelines and data warehouses that support data-driven decision-making. Their work will directly impact the quality, accessibility, and performance of data used by the organization and its clients.
Growth Opportunities:
- Technical Growth: Expand your skills and expertise in GCP, data engineering, and data governance by working on diverse projects and collaborating with talented team members
- Leadership Growth: Develop your leadership and mentoring skills by guiding junior team members and contributing to the team's success
- Career Progression: Demonstrate your value and potential for career growth within the organization by taking on increasingly complex projects and responsibilities
📝 Enhancement Note: Future Processing offers opportunities for career growth and professional development, with a focus on helping employees build their skills and advance their careers.
🌐 Work Environment
Office Type: Future Processing operates a modern, open-plan office designed to foster collaboration and creativity, with comfortable workspaces and multi-monitor setups that support data engineering work.
Office Location(s): Gliwice, Śląskie, Poland
Workspace Context:
- Collaboration: The office layout encourages teamwork and communication, with dedicated spaces for meetings, brainstorming sessions, and informal discussions
- Technology: The office is equipped with state-of-the-art hardware and software, including high-performance workstations, powerful servers, and cutting-edge data engineering tools
- Flexibility: The work environment offers flexible working arrangements, with opportunities for remote work and flexible hours, depending on the project and team requirements
Work Schedule: The standard work schedule is Monday to Friday, 9:00 AM to 5:00 PM, with flexibility for project deadlines and maintenance windows.
📝 Enhancement Note: Future Processing's work environment is designed to support collaboration, innovation, and productivity.
📄 Application & Technical Interview Process
Interview Process:
- Online Assessment: Complete an online assessment to evaluate your technical skills and problem-solving abilities (60-90 minutes)
- Technical Interview: Participate in a technical interview focused on your data engineering experience, GCP knowledge, and problem-solving skills (60-90 minutes)
- Cultural Fit Interview: Discuss your career goals, motivations, and cultural fit with the team (30-45 minutes)
- Final Decision: Receive a final decision and, if successful, an offer of employment
Portfolio Review Tips:
- Highlight your experience with GCP, data pipelines, data warehouses, and data lakes
- Include examples of your problem-solving skills and ability to optimize data processing workflows
- Showcase your attention to detail and commitment to data quality and governance
- Demonstrate your ability to work collaboratively and communicate effectively with stakeholders
Technical Challenge Preparation:
- Brush up on your SQL and Python skills, focusing on data manipulation, data analysis, and data transformation (see the practice example below)
- Familiarize yourself with GCP services, including BigQuery, Dataflow, Dataproc, and Cloud Composer
- Review data governance principles and best practices, including data cataloging, metadata management, and access control
- Prepare for behavioral questions that assess your problem-solving skills, collaboration, and communication abilities
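📝 Enhancement Note: As a rough sense of the practice level, the hypothetical snippet below pairs Python with BigQuery SQL: a window-function deduplication executed through the google-cloud-bigquery client. The project, dataset, table, and column names are made up for illustration.
```python
# Hypothetical practice exercise: keep only the latest record per key
# using ROW_NUMBER(), a common pattern in data-engineering screens.
# All names are placeholders; assumes Application Default Credentials.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

query = """
    SELECT user_id, event_type, amount
    FROM (
        SELECT
            *,
            ROW_NUMBER() OVER (
                PARTITION BY user_id, event_type
                ORDER BY ingested_at DESC
            ) AS rn
        FROM `example-project.analytics.events`
    )
    WHERE rn = 1  -- keep only the most recent row per (user, event type)
"""

for row in client.query(query).result():
    print(row.user_id, row.event_type, row.amount)
```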
ATS Keywords: SQL, GCP, BigQuery, Dataflow, Dataproc, Cloud Composer, ETL, ELT, Data Governance, Data Quality, Data Warehouse, Data Lake, Agile, Scrum, Collaboration, Communication, Problem Solving, Data Engineering, Data Pipeline, Data Transformation
📝 Enhancement Note: Future Processing's interview process is designed to evaluate candidates' technical skills, cultural fit, and potential for growth within the organization. Candidates who prepare thoroughly and demonstrate their passion for data engineering will have a competitive advantage.
🛠 Technology Stack & Infrastructure
Data Engineering Tools:
- GCP Services: BigQuery, Dataflow, Dataproc, Cloud Composer, Cloud Storage, Cloud Pub/Sub, Cloud Data Fusion, and other GCP services as needed (see the orchestration sketch below)
- Programming Languages: SQL, Python, and other relevant programming languages as needed
- Version Control: Git and GitHub for collaborative code development and version management
- CI/CD Pipelines: Jenkins, GitLab CI/CD, or other CI/CD tools for automated testing and deployment
- Data Governance: Apache Atlas, Talend Data Catalog, or other data governance tools for data cataloging, metadata management, and access control
📝 Enhancement Note: Future Processing uses a wide range of data engineering tools to support its clients' unique data needs. Candidates with experience in GCP and relevant data engineering tools will have a competitive advantage.
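📝 Enhancement Note: To illustrate how these tools typically fit together, below is a minimal, hypothetical Cloud Composer (Airflow) DAG that stages a daily file from Cloud Storage into BigQuery and then builds a small aggregate table. The DAG ID, schedule, operator choices, and all resource names are assumptions for illustration; it presumes Airflow 2.4+ with the Google provider package installed.
```python
# Hypothetical Cloud Composer (Airflow) DAG sketching the orchestration
# layer: a daily GCS-to-BigQuery load followed by a SQL transformation.
# Every identifier below is a placeholder.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_events_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load_staging = GCSToBigQueryOperator(
        task_id="load_staging",
        bucket="example-bucket",
        source_objects=["raw/events-{{ ds }}.csv"],  # templated run date
        destination_project_dataset_table="example-project.staging.events",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    build_mart = BigQueryInsertJobOperator(
        task_id="build_mart",
        configuration={
            "query": {
                "query": """
                    INSERT INTO `example-project.analytics.daily_totals`
                    SELECT user_id, SUM(amount) AS total, DATE('{{ ds }}') AS day
                    FROM `example-project.staging.events`
                    GROUP BY user_id
                """,
                "useLegacySql": False,
            }
        },
    )

    load_staging >> build_mart  # run the transformation only after the load
```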
👥 Team Culture & Values
Data Engineering Values:
- Quality: Deliver high-quality data solutions that meet or exceed client expectations
- Collaboration: Work closely with team members, stakeholders, and clients to ensure data projects align with business objectives
- Innovation: Embrace new technologies and best practices to drive continuous improvement in data engineering processes
- Integrity: Uphold ethical standards and maintain data confidentiality, security, and compliance
Collaboration Style:
- Agile/Scrum: The team follows an Agile/Scrum methodology, with regular sprint planning, daily stand-ups, and retrospectives
- Cross-Functional: Data engineers collaborate closely with data analysts, data scientists, and other team members to ensure data projects align with business objectives
- Mentoring: Senior team members provide guidance, support, and mentoring to help junior team members develop their skills and advance their careers
📝 Enhancement Note: Future Processing's data engineering team values collaboration, innovation, and continuous learning. Candidates who share these values and thrive in dynamic, team-oriented environments will excel in this role.
⚡ Challenges & Growth Opportunities
Technical Challenges:
- Data Complexity: Work with complex, high-volume data sets and ensure data quality, consistency, and performance
- Data Silos: Break down data silos and integrate data from diverse sources to provide a holistic view of the organization's data landscape
- Data Governance: Establish and enforce data governance policies, including data cataloging, metadata management, and access control
- Data Migration: Migrate data from on-premises systems to the cloud, ensuring data integrity, security, and compliance
Learning & Development Opportunities:
- GCP Training: Participate in GCP training and certification programs to expand your skills and expertise in GCP services
- Data Engineering Conferences: Attend data engineering conferences, webinars, and workshops to stay up-to-date with industry trends and best practices
- Mentoring: Seek mentorship from senior team members and contribute to the professional development of junior team members
📝 Enhancement Note: Future Processing offers opportunities for technical growth, career progression, and professional development. Candidates who embrace challenges and seek continuous learning will excel in this role.
💡 Interview Preparation
Technical Questions:
- Data Pipeline Design: Describe your approach to designing, implementing, and maintaining efficient, reusable, and reliable data pipelines using GCP services
- Data Warehouse & Lake Management: Explain your experience with building, maintaining, and optimizing data warehouses and lakes in BigQuery, including data modeling, data transformation, and performance optimization (see the optimization example below)
- Data Governance: Discuss your understanding of data governance principles and best practices, including data cataloging, metadata management, and access control
- Problem Solving: Provide examples of your problem-solving skills and ability to address data-related challenges in previous projects
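📝 Enhancement Note: For the warehouse-optimization question above, it helps to have one concrete technique ready to discuss. The hypothetical snippet below creates a partitioned and clustered BigQuery table through the Python client, a standard way to reduce scan costs; all names are illustrative.
```python
# Hypothetical optimization example: a date-partitioned, clustered table
# so that date-filtered queries prune partitions and scan less data.
# Project, dataset, and column names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

ddl = """
    CREATE TABLE IF NOT EXISTS `example-project.analytics.events_optimized`
    (
        user_id    STRING,
        event_type STRING,
        amount     FLOAT64,
        event_date DATE
    )
    PARTITION BY event_date  -- date filters scan only matching partitions
    CLUSTER BY user_id       -- co-locates rows for the common filter key
"""

client.query(ddl).result()  # DDL runs as a query job; result() waits for it
```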
Company & Culture Questions:
- Data-Driven Decision Making: Explain how you've used data to drive decision-making in previous projects and how you would approach data-driven decision-making in this role
- Collaboration: Describe your experience working with cross-functional teams and how you've ensured data projects align with business objectives
- Innovation: Share an example of a time when you've embraced new technologies or best practices to drive continuous improvement in data engineering processes
Portfolio Presentation Strategy:
- Data Pipeline Demonstration: Showcase your experience with GCP services, including BigQuery, Dataflow, Dataproc, and Cloud Composer, by walking the interviewer through a data pipeline you've designed and implemented
- Data Warehouse & Lake Demonstration: Demonstrate your ability to build, maintain, and optimize data warehouses and lakes in BigQuery by presenting a data warehouse or data lake you've created and explaining your design choices
- Problem-Solving Demonstration: Highlight your problem-solving skills by presenting a challenging data-related issue you've faced in a previous project and explaining how you addressed it
📌 Application Steps
To apply for this Senior Cloud Data Engineer (GCP) position at Future Processing:
- Submit Your Application: Click the "Apply" button on the job listing and complete the application form with your resume, cover letter, and portfolio
- Prepare Your Portfolio: Tailor your portfolio to showcase your experience with GCP, data pipelines, data warehouses, and data lakes. Include examples of your problem-solving skills and ability to optimize data processing workflows
- Research the Company: Familiarize yourself with Future Processing's data solutions offerings, client base, and company culture. Prepare thoughtful questions to ask during the interview process
- Prepare for Technical Interviews: Brush up on your SQL and Python skills, review GCP services, and practice problem-solving exercises to ensure you're well-prepared for the technical interview
- Prepare for Cultural Fit Interviews: Reflect on your career goals, motivations, and cultural fit with the team. Prepare examples of your collaboration, communication, and problem-solving skills to demonstrate your value as a team member
⚠️ Important Notice: This enhanced job description includes AI-generated insights and data engineering industry-standard assumptions. All details should be verified directly with Future Processing before making application decisions.
Application Requirements
Candidates should have at least 5 years of IT experience, including 3.5 years working with data in GCP. Proficiency in SQL and Python, along with experience in building and maintaining data lakes and warehouses, is essential.