Snowflake Cloud Data Engineer

Capgemini
Full-time · Gdańsk, Poland

📍 Job Overview

  • Job Title: Snowflake Cloud Data Engineer
  • Company: Capgemini
  • Location: Gdańsk, Pomorskie, Poland
  • Job Type: Full-time (hybrid)
  • Category: Data Engineering
  • Date Posted: 2025-06-27
  • Experience Level: Mid-Senior level

🚀 Role Summary

  • Design, build, and maintain robust Snowflake data pipelines while collaborating with cross-functional teams to deliver scalable data solutions.
  • Ensure data quality, integrity, and security across the lifecycle and contribute to the development of AI/ML solutions.
  • Join Capgemini's Insights & Data team, specializing in Cloud & Big Data engineering, delivering scalable systems that process massive, complex datasets using platforms like Snowflake, AWS, Azure, and GCP.

💻 Primary Responsibilities

  • Data Pipeline Design & Maintenance: Design, build, and maintain robust Snowflake data pipelines using native Snowflake tools or partner services (see the sketch after this list).
  • Cross-Functional Collaboration: Collaborate with cross-functional teams to deliver scalable, business-aligned data solutions that meet client needs.
  • Data Model Optimization: Optimize data models and schemas for performance and cost-efficiency, ensuring data quality, integrity, and security across the lifecycle.
  • Migration Planning & Execution: Plan and execute migrations from legacy systems to Snowflake, minimizing downtime and ensuring data consistency.
  • Monitoring & Alerting: Implement monitoring and alerting for proactive issue resolution, ensuring system availability and performance.
  • AI/ML Solution Development: Contribute to the development of AI/ML and generative AI solutions, staying ahead of the curve with emerging technologies.
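
As a rough illustration of the pipeline work above, the sketch below uses Snowflake-native streams and tasks to move changed rows from a landing table into a curated one on a schedule. All object names (raw.orders, analytics.orders_clean, etl_wh) are hypothetical placeholders, not details taken from this posting.

```sql
-- Capture row-level changes on a raw landing table.
CREATE OR REPLACE STREAM raw.orders_stream ON TABLE raw.orders;

-- A scheduled task that only runs when the stream actually has data.
CREATE OR REPLACE TASK raw.load_orders_clean
  WAREHOUSE = etl_wh
  SCHEDULE  = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('raw.orders_stream')
AS
  INSERT INTO analytics.orders_clean (order_id, customer_id, amount, loaded_at)
  SELECT order_id, customer_id, amount, CURRENT_TIMESTAMP()
  FROM raw.orders_stream
  WHERE METADATA$ACTION = 'INSERT';

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK raw.load_orders_clean RESUME;
```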

🎓 Skills & Qualifications

Education: Bachelor's degree in Computer Science, Data Science, or a related field. Relevant work experience may substitute for formal education.

Experience: Proven experience (2-5 years) in data engineering, with a strong focus on Snowflake and cloud-based data solutions.

Required Skills:

  • Hands-on experience with Snowflake in real-world projects
  • Solid understanding of Snowflake's architecture, pricing model, and cost optimization strategies (a brief sketch of cost and access-control DDL follows this list)
  • Experience designing data transformation pipelines using native Snowflake tools or partner services
  • Familiarity with Snowflake's security model and access controls
  • Practical knowledge of at least one public cloud (AWS, Azure, or GCP) in areas like storage, compute, networking, and DevOps
  • Comfortable with SQL and at least one programming language (e.g., Python, Scala, Java, or Bash)
  • Strong communication skills and a proactive mindset
  • Fluent in English (B2+)
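
Two of the skills above, cost awareness and the security model, reduce to a handful of DDL statements in day-to-day work. A minimal sketch, with all warehouse, role, schema, and user names invented for illustration:

```sql
-- Cost control: a small warehouse that suspends itself after 60 seconds
-- of idle time and resumes on demand, so compute is only billed when used.
CREATE WAREHOUSE IF NOT EXISTS etl_wh
  WITH WAREHOUSE_SIZE = 'XSMALL'
       AUTO_SUSPEND = 60
       AUTO_RESUME = TRUE
       INITIALLY_SUSPENDED = TRUE;

-- Security model: role-based access control with least privilege.
CREATE ROLE IF NOT EXISTS analyst_role;
GRANT USAGE ON DATABASE analytics TO ROLE analyst_role;
GRANT USAGE ON SCHEMA analytics.reporting TO ROLE analyst_role;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics.reporting TO ROLE analyst_role;
GRANT ROLE analyst_role TO USER some_analyst;
```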

Preferred Skills:

  • Experience with AI/ML or generative AI projects
  • Exposure to CI/CD pipelines and DevOps tools
  • Knowledge of data governance and compliance frameworks
  • Certifications in Snowflake or cloud platforms (e.g., AWS, Azure)

📊 Portfolio & Project Requirements

Portfolio Essentials:

  • Showcase your Snowflake data pipeline design and implementation skills.
  • Highlight your ability to optimize data models and schemas for performance and cost-efficiency.
  • Demonstrate your experience with data migration planning and execution.
  • Display your proficiency in monitoring and alerting for proactive issue resolution.

Technical Documentation:

  • Provide clear, well-commented code and documentation for your Snowflake data pipelines.
  • Include detailed data transformation processes, data validation checks, and error handling strategies (an example validation check follows this list).
  • Explain your approach to data security, access controls, and compliance with relevant regulations.
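
For the validation point above, reviewers generally want to see explicit, runnable checks rather than prose. One hedged example of the kind of assertion a pipeline might run after each load; the table and column names are hypothetical:

```sql
-- Check 1: no duplicate business keys. Should return zero rows;
-- a wrapper script or scheduler can fail the pipeline otherwise.
SELECT order_id, COUNT(*) AS duplicate_count
FROM analytics.orders_clean
GROUP BY order_id
HAVING COUNT(*) > 1;

-- Check 2: no NULLs in required columns. Should return 0.
SELECT COUNT(*) AS null_key_rows
FROM analytics.orders_clean
WHERE order_id IS NULL OR customer_id IS NULL;
```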

💵 Compensation & Benefits

Salary Range: The estimated salary range for this role in Gdańsk, Poland is 12,000 - 18,000 PLN per month, based on market research and regional adjustments.

Benefits:

  • Private medical care with Medicover
  • Life insurance
  • Access to over 70 training tracks with certification opportunities on the NEXT platform
  • Free access to Education First languages platform, Pluralsight, TED Talks, Coursera, and Udemy Business materials and trainings
  • Home office package to support hybrid work

Working Hours: Full-time (40 hours per week), with a hybrid model that combines on-site work in a modern office and remote work.

🎯 Team & Company Context

Company Culture: Capgemini is committed to diversity and inclusion, ensuring fairness in all employment practices. They strive to create a workplace where everyone can succeed and feel valued.

Team Structure: The Insights & Data team is a hub of innovation where data-driven solutions are built for global clients across finance, logistics, automotive, and telecom, with a specialization in Cloud & Big Data engineering on Snowflake, AWS, Azure, and GCP.

Development Methodology: The team follows Agile methodologies, with a focus on collaboration, continuous improvement, and delivering value to clients. They emphasize cross-functional collaboration with design, marketing, and business teams to ensure data solutions meet business needs.

Company Website: https://www.capgemini.com/

📈 Career & Growth Analysis

Career Level: Mid-Senior, with a focus on data engineering, data modeling, and cloud-based data solutions. This role offers growth paths into senior data engineering, technical leadership, or specialized data architecture roles.

Reporting Structure: The Snowflake Cloud Data Engineer role reports to the Data Engineering Manager within the Insights & Data team. The team is part of Capgemini's global technology organization, working closely with clients to deliver data-driven solutions.

Technical Impact: This role has a significant impact on Capgemini's clients, enabling them to make data-driven decisions, optimize operations, and unlock new business value through scalable, secure, and efficient data solutions.

🌐 Work Environment

Office Type: Modern, ergonomic office space with state-of-the-art technology and collaboration tools.

Office Location(s): Gdańsk, with opportunities for remote work as part of the hybrid work model.

Workspace Context:

  • Collaborative workspace with dedicated team areas and meeting rooms
  • Access to the latest technology, including high-performance workstations and multiple monitors
  • Cross-functional collaboration with designers, marketers, and business teams to ensure data solutions meet business needs

Work Schedule: Hybrid work model, with a combination of on-site work at the modern office and remote work from home. The specific work schedule is flexible and can be discussed with the hiring manager.

📄 Application & Technical Interview Process

Interview Process:

  1. Technical Phone Screen: A 30-minute phone or video call to assess your technical skills and cultural fit.
  2. Technical Deep Dive: A 2-hour technical interview focusing on your Snowflake expertise, data modeling, and cloud-based data solutions.
  3. Behavioral Interview: A 30-minute interview to discuss your problem-solving skills, communication, and teamwork.
  4. Final Decision: Based on interview feedback and your overall fit with the team and company culture.

Portfolio Review Tips:

  • Highlight your Snowflake data pipeline design and implementation skills.
  • Showcase your ability to optimize data models and schemas for performance and cost-efficiency.
  • Demonstrate your experience with data migration planning and execution.
  • Provide clear, well-commented code and documentation for your Snowflake data pipelines.

Technical Challenge Preparation:

  • Brush up on your Snowflake knowledge, focusing on data pipeline design, data modeling, and cloud-based data solutions.
  • Prepare for questions about data quality, data security, and compliance with relevant regulations.
  • Familiarize yourself with Capgemini's industry-specific data solutions and client projects.

ATS Keywords: Snowflake, Data Engineering, Cloud Computing, Data Modeling, Data Migration, AI/ML, Generative AI, Data Quality, Data Security, Agile Methodologies, Hybrid Work, Remote Work, Collaboration, Teamwork, Problem-Solving

🛠 Technology Stack & Infrastructure

Data Engineering Technologies:

  • Snowflake
  • AWS, Azure, or GCP
  • SQL (PostgreSQL, MySQL, or other relevant databases)
  • Python, Scala, Java, or Bash
  • Apache Airflow, Apache Beam, or other ETL/ELT tools
  • Data transformation and data quality tools (e.g., dbt, Great Expectations)

Cloud Platforms:

  • AWS (Amazon Web Services)
  • Azure (Microsoft Azure)
  • GCP (Google Cloud Platform)

AI/ML Tools & Frameworks:

  • TensorFlow, PyTorch, or other deep learning libraries
  • Scikit-learn, XGBoost, or other machine learning libraries
  • AWS SageMaker, Azure Machine Learning, or other cloud-based AI/ML services

Monitoring & Alerting Tools:

  • Prometheus, Grafana, or other open-source monitoring tools
  • Datadog, New Relic, or other commercial monitoring tools
  • PagerDuty, OpsGenie, or other incident management platforms
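
Alongside the external tools above, Snowflake itself supports scheduled ALERT objects that can watch a condition and send a notification. A minimal sketch, assuming a pre-existing email notification integration; every object name here is hypothetical:

```sql
-- Notify the team if the curated table has not received rows for 2 hours.
CREATE OR REPLACE ALERT analytics.stale_orders_alert
  WAREHOUSE = etl_wh
  SCHEDULE  = '60 MINUTE'
  IF (EXISTS (
        SELECT 1
        FROM analytics.orders_clean
        HAVING MAX(loaded_at) < DATEADD('hour', -2, CURRENT_TIMESTAMP())
      ))
  THEN CALL SYSTEM$SEND_EMAIL(
         'ops_email_int',                -- hypothetical notification integration
         'data-oncall@example.com',
         'Snowflake alert: orders_clean is stale',
         'No new rows loaded into analytics.orders_clean in the last 2 hours.');

-- Alerts, like tasks, start suspended.
ALTER ALERT analytics.stale_orders_alert RESUME;
```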

👥 Team Culture & Values

Data Engineering Values:

  • Data-Driven Decisions: Capgemini's data engineering team emphasizes data-driven decision-making, ensuring that data solutions are based on accurate, reliable, and relevant data.
  • Collaboration & Communication: The team values open communication, active listening, and cross-functional collaboration to deliver effective data solutions.
  • Continuous Learning & Improvement: Capgemini encourages continuous learning and improvement, with a focus on staying up-to-date with emerging technologies and best practices.
  • Client Focus: The team is dedicated to understanding and meeting client needs, ensuring that data solutions deliver real business value.

Collaboration Style:

  • The data engineering team follows Agile methodologies, with a focus on cross-functional collaboration, continuous improvement, and delivering value to clients.
  • They emphasize active listening, open communication, and regular feedback to ensure data solutions meet business needs.
  • The team encourages knowledge sharing, technical mentoring, and continuous learning to support individual and team growth.

⚡ Challenges & Growth Opportunities

Technical Challenges:

  • Designing and implementing scalable, secure, and efficient data pipelines in Snowflake.
  • Optimizing data models and schemas for performance and cost-efficiency.
  • Planning and executing migrations from legacy systems to Snowflake with minimal downtime and data loss (see the sketch after this list).
  • Developing and implementing monitoring and alerting solutions for proactive issue resolution.
  • Contributing to the development of AI/ML and generative AI solutions, staying ahead of the curve with emerging technologies.
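
For the migration challenge, bulk loads into Snowflake commonly run through an external stage plus COPY INTO. A hedged sketch, assuming the legacy system has been exported to a (hypothetical) S3 bucket and a storage integration is already configured:

```sql
-- Point Snowflake at the legacy export; using a STORAGE INTEGRATION
-- avoids embedding cloud credentials in SQL.
CREATE OR REPLACE STAGE migration.legacy_orders_stage
  URL = 's3://legacy-export-bucket/orders/'
  STORAGE_INTEGRATION = legacy_s3_int
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1 FIELD_OPTIONALLY_ENCLOSED_BY = '"');

-- Bulk load with explicit error handling. COPY INTO tracks which files
-- it has already loaded, so a rerun after a failure does not duplicate rows.
COPY INTO analytics.orders_clean
  FROM @migration.legacy_orders_stage
  ON_ERROR = 'ABORT_STATEMENT';

-- Reconcile row counts against the source system before cutover.
SELECT COUNT(*) AS migrated_rows FROM analytics.orders_clean;
```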

Learning & Development Opportunities:

  • Technical Skill Development: Capgemini offers access to over 70 training tracks with certification opportunities on the NEXT platform, allowing you to develop your technical skills and advance your career.
  • Emerging Technologies: The company encourages staying up-to-date with emerging technologies, providing opportunities to work on cutting-edge projects and gain exposure to new tools and methodologies.
  • Leadership Development: Capgemini offers mentorship and leadership development programs, providing opportunities to grow into technical leadership roles and drive team success.

💡 Interview Preparation

Technical Questions:

  • Snowflake Expertise: Questions about Snowflake architecture, pricing model, cost optimization strategies, and data transformation pipelines (see the example query after this list).
  • Data Modeling & Design: Questions about data modeling, schema optimization, and data quality assurance.
  • Cloud Computing: Questions about AWS, Azure, or GCP, focusing on storage, compute, networking, and DevOps.
  • AI/ML & Generative AI: Questions about AI/ML and generative AI concepts, tools, and frameworks.
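
For the cost-optimization line of questioning, it helps to know where consumption data actually lives. A small example against Snowflake's built-in ACCOUNT_USAGE share (note these views lag real time by up to a few hours); the 7-day window is an arbitrary choice for illustration:

```sql
-- Credits consumed per warehouse over the last 7 days: a common starting
-- point for answering "where is our Snowflake spend going?".
SELECT
    warehouse_name,
    SUM(credits_used) AS credits_7d
FROM snowflake.account_usage.warehouse_metering_history
WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
GROUP BY warehouse_name
ORDER BY credits_7d DESC;
```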

Company & Culture Questions:

  • Capgemini's Industry-Specific Solutions: Questions about Capgemini's industry-specific data solutions and client projects.
  • Agile Methodologies: Questions about Agile methodologies, cross-functional collaboration, and continuous improvement.
  • Client Focus: Questions about understanding and meeting client needs, ensuring data solutions deliver real business value.

Portfolio Presentation Strategy:

  • Snowflake Data Pipeline Demonstration: Demonstrate your ability to design, build, and maintain robust Snowflake data pipelines using native Snowflake tools or partner services.
  • Data Modeling & Schema Optimization: Showcase your ability to optimize data models and schemas for performance and cost-efficiency.
  • Data Migration Planning & Execution: Explain your approach to planning and executing migrations from legacy systems to Snowflake.
  • Monitoring & Alerting Solutions: Demonstrate your proficiency in implementing monitoring and alerting solutions for proactive issue resolution.

📌 Application Steps

To apply for this Snowflake Cloud Data Engineer position:

  1. Update Your Resume: Tailor your resume to highlight your Snowflake, data engineering, and cloud computing skills, with a focus on data pipeline design, data modeling, and cloud-based data solutions.
  2. Prepare Your Portfolio: Showcase your Snowflake data pipeline design and implementation skills, highlighting your ability to optimize data models and schemas for performance and cost-efficiency.
  3. Practice Technical Interview Questions: Brush up on your Snowflake knowledge, focusing on data pipeline design, data modeling, and cloud-based data solutions. Prepare for questions about data quality, data security, and compliance with relevant regulations.
  4. Research Capgemini: Familiarize yourself with Capgemini's industry-specific data solutions and client projects, understanding their commitment to diversity, inclusion, and client focus.

Application Requirements

Hands-on experience with Snowflake and a solid understanding of its architecture and pricing model are essential. Familiarity with at least one public cloud and strong communication skills are also required.