Distributed Cloud | Google Data Project

Devoteam
Full-time | Porto, Portugal

📍 Job Overview

  • Job Title: Senior Cloud Data Engineer - Google Data Project
  • Company: Devoteam
  • Location: Porto, Portugal
  • Job Type: Full-time
  • Category: Data Engineer
  • Date Posted: July 24, 2025
  • Experience Level: Mid-Senior level (4+ years)
  • Remote Status: On-site (Porto, Portugal)

🚀 Role Summary

  • Lead end-to-end data projects focused on the engineering component within the Google Cloud Platform (GCP) ecosystem.
  • Collaborate with a multidisciplinary team of Cloud experts, designers, business consultants, engineers, and developers to deliver innovative solutions.
  • Contribute to Devoteam's mission of transforming technology to create value for clients, partners, and employees in a world where technology is developed for people.

📝 Enhancement Note: This role requires a strong background in data engineering and GCP data services to drive successful project delivery and contribute to Devoteam's tech-for-people culture.

💻 Primary Responsibilities

  • Project Delivery: Lead data projects with a focus on the engineering component, working with GCP data services such as BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub, and Dataplex.
  • Data Processing: Write efficient SQL queries and develop data processing pipelines using frameworks such as Apache Beam, supported by CI/CD automation.
  • Data Integration & Streaming: Handle data ingestion from various sources into GCP, including data integration and streaming using tools like Apache Kafka.
  • Workflow Orchestration: Build and manage data pipelines, with a deep understanding of workflow orchestration, task scheduling, and dependency management.
  • Collaboration: Work closely with cross-functional teams to ensure data engineering tasks align with project goals and deliverables.
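The workflow orchestration responsibility above boils down to executing tasks in an order that respects their dependencies. In practice this role would use Apache Airflow or Cloud Composer, but the core idea can be sketched with Python's standard-library topological sorter; the task names below are hypothetical.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline DAG: each key lists the tasks it depends on.
dag = {
    "extract": set(),
    "validate": {"extract"},
    "transform": {"extract", "validate"},
    "load": {"transform"},
}

def run_order(graph):
    """Return one valid execution order that respects every dependency."""
    return list(TopologicalSorter(graph).static_order())

print(run_order(dag))
```

Airflow DAGs express the same relationships with operators and `>>` dependencies; the scheduler then performs essentially this ordering, plus retries, scheduling intervals, and parallel execution of independent tasks.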

📝 Enhancement Note: This role requires a balance of technical expertise and collaborative skills to effectively manage data projects and contribute to Devoteam's diverse and dynamic team environment.

🎓 Skills & Qualifications

Education: Bachelor's degree in IT or a similar field.

Experience: 4+ years of professional experience in a data engineering role.

Required Skills:

  • Proficient in GCP data services (BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub, Dataplex)
  • Strong SQL skills
  • Programming languages: Python and Java (mandatory)
  • Experience with tools like Apache Airflow, Google Cloud Composer, or Cloud Data Fusion
  • Code-review mindset
  • Familiarity with Terraform, GitHub, GitHub Actions, Bash, and/or Docker
  • Knowledge of streaming data processing using tools like Apache Kafka
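As an illustration of the "strong SQL skills" requirement, the query below uses a window function to keep only the latest record per key, a deduplication pattern that translates directly to BigQuery Standard SQL. The schema and data are invented for illustration, and sqlite3 stands in for BigQuery here.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE events (user_id TEXT, event_ts INTEGER, status TEXT);
INSERT INTO events VALUES
  ('u1', 100, 'pending'),
  ('u1', 200, 'done'),
  ('u2', 150, 'pending');
""")

# Keep only the most recent row per user_id -- a common dedup step
# before loading into a warehouse table.
latest = conn.execute("""
    SELECT user_id, status FROM (
        SELECT user_id, status,
               ROW_NUMBER() OVER (PARTITION BY user_id ORDER BY event_ts DESC) AS rn
        FROM events
    ) WHERE rn = 1
    ORDER BY user_id
""").fetchall()
print(latest)  # [('u1', 'done'), ('u2', 'pending')]
```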

Preferred Skills:

  • GCP certifications
  • Proficiency in English (written and spoken)

📝 Enhancement Note: While the role requires a strong foundation in data engineering and GCP technologies, candidates with a diverse skill set and a willingness to learn will thrive in Devoteam's dynamic environment.

📊 Portfolio & Project Requirements

Portfolio Essentials:

  • Demonstrate a strong understanding of GCP data services by showcasing relevant projects and case studies.
  • Highlight data processing pipelines, data integration, and streaming data processing examples.
  • Showcase your ability to write efficient SQL queries and manage data workflows.

Technical Documentation:

  • Provide clear and concise documentation for your data engineering projects, including data sources, processing steps, and output formats.
  • Include any relevant code snippets or scripts used in your projects to demonstrate your technical proficiency.

📝 Enhancement Note: As a Senior Cloud Data Engineer, your portfolio should showcase your ability to lead data projects, manage data pipelines, and make informed decisions about data processing and integration strategies.

💵 Compensation & Benefits

Salary Range: €45,000 - €65,000 per year (based on experience and local market rates in Porto, Portugal)

Benefits:

  • Competitive salary and benefits package
  • Opportunities for professional growth and development within a global organization
  • Dynamic and diverse work environment with a strong focus on technology and people
  • Equal opportunities and an active fight against all forms of discrimination

Working Hours: Full-time (40 hours/week) with flexible working arrangements and a focus on results and delivery.

📝 Enhancement Note: While the salary range is based on local market rates and experience, Devoteam offers a comprehensive benefits package and opportunities for professional growth that make it an attractive employer for data engineering professionals.

🎯 Team & Company Context

🏢 Company Culture

Industry: Global technology consulting and digital transformation services, with a focus on cloud, data, and cybersecurity.

Company Size: Large (10,000+ employees) with a presence in over 20 EMEA countries.

Founded: 1993, with a strong history of growth and innovation in the technology industry.

Team Structure:

  • Multidisciplinary teams consisting of Cloud experts, designers, business consultants, engineers, developers, and other specialists.
  • Collaborative and dynamic work environment, fostering creativity and technology-driven problem-solving.

Development Methodology:

  • Agile and iterative development processes, with a focus on delivering value to clients and partners.
  • Strong emphasis on continuous learning, improvement, and innovation.

Company Website: Devoteam Group

📝 Enhancement Note: Devoteam's culture is centered around technology for people, fostering a dynamic and collaborative work environment that values diversity, creativity, and continuous learning.

📈 Career & Growth Analysis

Career Level: Senior Cloud Data Engineer, responsible for leading data projects, managing data pipelines, and driving technical decisions within the GCP ecosystem.

Reporting Structure: Report directly to the Data Engineering team lead, collaborating with cross-functional teams to deliver data projects and contribute to Devoteam's tech-for-people mission.

Technical Impact: Contribute to the development and optimization of data processing pipelines, data integration, and streaming data processing workflows, ensuring data quality, performance, and reliability.

Growth Opportunities:

  • Technical leadership and mentoring opportunities within the data engineering team.
  • Opportunities to specialize in specific GCP data services or emerging technologies.
  • Potential to take on more complex projects and drive strategic data initiatives within the organization.

📝 Enhancement Note: As a Senior Cloud Data Engineer at Devoteam, you will have the opportunity to grow both technically and professionally, contributing to the company's success and driving your own career development.

🌐 Work Environment

Office Type: Modern and collaborative office space in Porto, Portugal, designed to foster creativity and innovation.

Office Location(s): Porto, Portugal (Av. dos Aliados, 4000 Porto, Portugal)

Workspace Context:

  • Access to state-of-the-art technology and tools to support data engineering projects.
  • Collaborative workspaces designed to facilitate cross-functional team collaboration and communication.
  • Flexible working arrangements, with a focus on results and delivery.

Work Schedule: Full-time (40 hours/week) with flexible working arrangements and a focus on results and delivery.

📝 Enhancement Note: Devoteam's work environment is designed to support collaboration, innovation, and continuous learning, fostering a dynamic and engaging workspace for data engineering professionals.

📄 Application & Technical Interview Process

Interview Process:

  1. Technical Screening: Demonstrate your technical proficiency in GCP data services, SQL, and data processing pipelines through a hands-on assessment or case study.
  2. Cultural Fit Interview: Discuss your alignment with Devoteam's tech-for-people culture, values, and mission.
  3. Final Interview: Present your portfolio, discuss your approach to data engineering, and answer any remaining questions from the hiring team.

Portfolio Review Tips:

  • Highlight your experience with GCP data services, data processing pipelines, and data integration workflows.
  • Showcase your ability to write efficient SQL queries and manage data workflows.
  • Include any relevant code snippets or scripts to demonstrate your technical proficiency.

Technical Challenge Preparation:

  • Brush up on your GCP data services knowledge, focusing on BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub, and Dataplex.
  • Practice writing efficient SQL queries and managing data workflows using relevant tools and frameworks.
  • Familiarize yourself with Apache Airflow, Google Cloud Composer, or Cloud Data Fusion, as well as other relevant data engineering tools.

ATS Keywords: (Organized by category)

  • Programming Languages: Python, Java, SQL
  • GCP Data Services: BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub, Dataplex
  • Data Engineering Tools: Apache Airflow, Google Cloud Composer, Cloud Data Fusion, Apache Kafka, Terraform, GitHub, GitHub Actions, Bash, Docker
  • Methodologies: Agile, CI/CD
  • Soft Skills: Collaboration, communication, problem-solving, leadership, mentoring

📝 Enhancement Note: To succeed in the interview process, focus on demonstrating your technical proficiency in GCP data services and data engineering, as well as your alignment with Devoteam's tech-for-people culture and values.

🛠 Technology Stack & Infrastructure

Frontend Technologies: (Not applicable for this role)

Backend & Server Technologies:

  • GCP Data Services: BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub, Dataplex
  • Programming Languages: Python, Java, SQL
  • Data Engineering Tools: Apache Airflow, Google Cloud Composer, Cloud Data Fusion, Apache Kafka, Terraform, GitHub, GitHub Actions, Bash, Docker

Development & DevOps Tools:

  • CI/CD: GitHub Actions, Jenkins (if applicable)
  • Monitoring & Logging: Cloud Monitoring and Cloud Logging (formerly Stackdriver), ELK Stack (if applicable)
  • Infrastructure as Code (IaC): Terraform, Google Cloud Deployment Manager (if applicable)

📝 Enhancement Note: As a Senior Cloud Data Engineer, you will work extensively with GCP data services and relevant data engineering tools to deliver data projects and manage data pipelines.

👥 Team Culture & Values

Data Engineering Values:

  • Technical Excellence: Pursue continuous learning and improvement in GCP data services and data engineering best practices.
  • Collaboration: Work closely with cross-functional teams to deliver data projects and drive innovation.
  • Performance Optimization: Focus on data quality, performance, and reliability to ensure optimal data processing and integration workflows.
  • User Experience: Consider the user impact of data-driven decisions and strive to create value for clients, partners, and employees.

Collaboration Style:

  • Cross-functional Integration: Work closely with designers, business consultants, engineers, and developers to ensure data engineering tasks align with project goals and deliverables.
  • Code Review Culture: Foster a code-review mindset to ensure data processing pipelines are efficient, reliable, and maintainable.
  • Knowledge Sharing: Contribute to Devoteam's culture of continuous learning and knowledge sharing by mentoring team members and participating in relevant training and development opportunities.

📝 Enhancement Note: Devoteam's team culture is centered around collaboration, innovation, and continuous learning, fostering a dynamic and engaging work environment for data engineering professionals.

⚡ Challenges & Growth Opportunities

Technical Challenges:

  • Data Integration: Develop and manage data integration workflows, including extract, transform, and load (ETL) processes.
  • Data Streaming: Design and implement real-time data processing pipelines using tools like Apache Kafka and GCP Pub/Sub.
  • Data Warehousing: Design and optimize data warehouses and data lakes to support business intelligence and analytics initiatives.
  • Emerging Technologies: Stay up-to-date with the latest GCP data services and data engineering trends, and explore opportunities to incorporate new technologies into your projects.
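The data streaming challenge above centers on windowed aggregation. Beam, Dataflow, and Kafka Streams provide this as a primitive, but the underlying idea of a tumbling (fixed, non-overlapping) window can be sketched in plain Python; the events and window size below are illustrative.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs):
    """Count (key, epoch_seconds) events per (key, window_start),
    where each window covers a fixed, non-overlapping span of time."""
    counts = defaultdict(int)
    for key, ts in events:
        window_start = ts - (ts % window_secs)  # floor to window boundary
        counts[(key, window_start)] += 1
    return dict(counts)

events = [("clicks", 3), ("clicks", 7), ("clicks", 12), ("views", 4)]
print(tumbling_window_counts(events, window_secs=10))
```

A production streaming pipeline adds what this sketch omits: event-time watermarks, late-data handling, and state that survives worker restarts, which is precisely what Dataflow and Kafka Streams manage for you.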

Learning & Development Opportunities:

  • GCP Certifications: Pursue relevant GCP certifications to enhance your technical proficiency and demonstrate your commitment to continuous learning.
  • Conferences & Events: Attend industry conferences, webinars, and meetups to stay informed about the latest trends and best practices in data engineering and GCP data services.
  • Mentoring & Coaching: Seek out mentoring opportunities within Devoteam to grow both technically and professionally, and consider providing mentorship to junior team members.

📝 Enhancement Note: As a Senior Cloud Data Engineer at Devoteam, you will face technical challenges that require creativity, innovation, and a deep understanding of GCP data services and data engineering best practices. Embrace these challenges as opportunities for growth and learning.

💡 Interview Preparation

Technical Questions:

  • GCP Data Services: Demonstrate your proficiency in GCP data services, including BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub, and Dataplex.
  • SQL: Showcase your ability to write efficient SQL queries and optimize data processing workflows.
  • Data Engineering: Explain your approach to data integration, data streaming, and data warehousing, and discuss any relevant projects or case studies.

Company & Culture Questions:

  • Tech-for-People: Discuss your understanding of Devoteam's tech-for-people culture and how you would contribute to it as a Senior Cloud Data Engineer.
  • Collaboration: Describe your experience working with cross-functional teams and your approach to fostering a collaborative work environment.
  • Problem-Solving: Share an example of a complex data engineering challenge you've faced and how you approached solving it.

Portfolio Presentation Strategy:

  • Project Selection: Choose relevant data engineering projects that showcase your technical proficiency in GCP data services and data processing pipelines.
  • Storytelling: Prepare a compelling narrative that highlights the challenges you faced, the solutions you implemented, and the value you delivered through your projects.
  • Demonstration: Include live demonstrations or interactive elements to engage the interview panel and showcase your technical skills.

📝 Enhancement Note: Prepare thoughtful responses to both technical and cultural questions, and be ready to discuss your approach to data engineering challenges and opportunities with concrete examples from past projects.

📌 Application Steps

To apply for this Senior Cloud Data Engineer position at Devoteam:

  1. Customize Your Portfolio: Tailor your portfolio to highlight your experience with GCP data services, data processing pipelines, and data integration workflows. Include any relevant code snippets or scripts to demonstrate your technical proficiency.
  2. Optimize Your Resume: Emphasize your technical skills, experience with GCP data services, and any relevant certifications or training. Highlight your problem-solving abilities, collaborative skills, and commitment to continuous learning.
  3. Prepare for Technical Screening: Brush up on your GCP data services knowledge, SQL skills, and data processing pipeline management. Practice writing efficient SQL queries and managing data workflows using relevant tools and frameworks.
  4. Research Devoteam: Familiarize yourself with Devoteam's tech-for-people culture, values, and mission. Understand their approach to data engineering, collaboration, and innovation within the GCP ecosystem.
  5. Prepare for Cultural Fit Interview: Reflect on your alignment with Devoteam's values and culture, and prepare thoughtful responses to questions about your approach to collaboration, problem-solving, and continuous learning.

⚠️ Important Notice: This enhanced job description includes AI-generated insights and web development industry-standard assumptions. All details should be verified directly with the hiring organization before making application decisions.

Application Requirements

Candidates should have a bachelor's degree in IT or a similar field and at least 4 years of professional experience in a data engineering role. Experience with GCP Data Services and knowledge of programming languages such as Python, Java, and SQL are mandatory.