Data Platform Infrastructure Engineer

Groupon
Full-Time

πŸ“ Job Overview

  • Job Title: Data Platform Infrastructure Engineer
  • Company: Groupon
  • Location: Remote - Peru
  • Job Type: Full-Time
  • Category: DevOps Engineer, Infrastructure Engineer
  • Date Posted: 2025-07-23
  • Experience Level: Mid-Senior level (5-10 years)
  • Remote Status: Remote OK

🚀 Role Summary

  • Key Responsibilities: Design, build, and automate scalable data infrastructure across GCP and AWS while ensuring platform reliability, security, and cost-efficiency. Collaborate with various teams to support data-driven initiatives and drive the platform forward.
  • Key Technologies: GCP, AWS, Terraform, Python, Shell, Big Data frameworks (Google Dataproc, BigQuery, Spark, Airflow), Monitoring Tools, Data Security, Machine Learning

💻 Primary Responsibilities

  • πŸ“ Enhancement Note: This role requires a strong focus on infrastructure as code, cloud platforms, and big data technologies to support Groupon's data-driven initiatives.

  • πŸš€ Infrastructure Design & Automation: Design, build, and automate scalable data infrastructure across GCP and AWS using an infrastructure as code approach with Terraform.

  • πŸ›‘οΈ Platform Reliability & Security: Ensure platform reliability, security, and cost-efficiency through robust monitoring, automation, and compliance practices.

  • 🀝 Collaboration: Work closely with Data Operations, Tools, and Pipeline teams to support data-driven initiatives and drive the platform forward.

  • πŸ› οΈ Pipeline Development & Maintenance: Develop and maintain secure, scalable data pipelines and infrastructure using an infrastructure as code approach.

  • πŸ’‘ Innovation & Best Practices: Champion cloud-first best practices and foster a culture of high performance, innovation, and accountability.

  • 🌟 Agile Development: Engage in the agile development process, leveraging async communication and shared ownership to support fast-paced business needs.
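
The infrastructure-as-code theme running through these responsibilities reduces to one pattern: declare desired state, diff it against actual state, and apply only the changes. The sketch below illustrates that reconcile loop in plain Python; it is not Terraform itself, and the resource names are hypothetical:

```python
# Illustrative desired-state reconciliation: the core idea behind
# infrastructure-as-code tools like Terraform. Resource names are
# hypothetical examples, not real Groupon infrastructure.

def plan(desired: dict, actual: dict) -> dict:
    """Compute the change set: what to create, update, or destroy."""
    return {
        "create": sorted(desired.keys() - actual.keys()),
        "destroy": sorted(actual.keys() - desired.keys()),
        "update": sorted(k for k in desired.keys() & actual.keys()
                         if desired[k] != actual[k]),
    }

def apply(plan_result: dict, actual: dict, desired: dict) -> dict:
    """Apply the plan, returning the new actual state (idempotent)."""
    new_state = {k: v for k, v in actual.items()
                 if k not in plan_result["destroy"]}
    for k in plan_result["create"] + plan_result["update"]:
        new_state[k] = desired[k]
    return new_state

desired = {"gcs-bucket-raw": {"location": "US"},
           "bq-dataset-events": {"region": "us-central1"}}
actual = {"gcs-bucket-raw": {"location": "EU"},
          "vm-legacy": {"zone": "us-east1-b"}}

changes = plan(desired, actual)
```

Running `plan` again after `apply` yields an empty change set, which is the idempotency property that makes infrastructure-as-code runs safe to repeat.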

🎓 Skills & Qualifications

Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.

Experience: 5+ years of hands-on experience in Cloud Infrastructure, Data Platform Engineering, or DevOps.

Required Skills:

  • Strong expertise in cloud platforms, preferably GCP or AWS (including IAM, VPC, Storage, BigQuery, Cloud Functions)
  • Proficiency in Infrastructure as Code with Terraform
  • Good programming and scripting skills in Python and Shell
  • Deep knowledge of Big Data frameworks such as Google Dataproc, Google Cloud Storage (GCS), BigQuery, Bigtable, Hive, Spark, and Airflow
  • Excellent communication and collaboration skills, with the ability to work effectively in a globally distributed team
  • Strong sense of ownership, self-management, and a commitment to building for scale and long-term maintainability

Preferred Skills:

  • Familiarity with monitoring and observability tools like Grafana, Elasticsearch, Prometheus, or Wavefront
  • Experience implementing access controls, data encryption, and compliance standards (e.g., SOX)
  • Cross-cloud expertise (strong in either AWS or GCP with a willingness to learn the other)
  • Experience working with or supporting ML platforms or AI infrastructure
  • Prior experience mentoring junior engineers or contributing to team leadership

📊 Portfolio & Project Requirements

πŸ“ Enhancement Note: As this role focuses on infrastructure engineering, a portfolio showcasing relevant projects, code samples, and case studies demonstrating your expertise in cloud infrastructure, data pipelines, and big data technologies is essential.

Portfolio Essentials:

  • 🏢 Infrastructure Projects: Highlight your experience designing, building, and automating scalable data infrastructure on GCP or AWS using Terraform.
  • 🔒 Security & Compliance: Demonstrate your understanding of data security, access controls, and compliance standards by showcasing relevant projects or case studies.
  • 📈 Data Pipelines: Showcase your ability to develop and maintain secure, scalable data pipelines using big data technologies like Google Dataproc, BigQuery, Spark, or Airflow.
  • 🌐 Cloud Architecture: Illustrate your expertise in cloud architecture by presenting your approach to designing and implementing reliable, cost-efficient, and secure data platforms.

Technical Documentation:

  • 📄 Code Quality: Demonstrate your commitment to code quality, commenting, and documentation standards by providing code samples or project documentation.
  • 🔄 Deployment Processes: Showcase your experience with version control, deployment processes, and server configuration by explaining your approach to infrastructure as code and automation.
  • 📈 Performance Metrics: Highlight your understanding of testing methodologies, performance metrics, and optimization techniques by discussing your approach to monitoring and optimizing data pipelines and infrastructure.
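
As a concrete example of the performance-metrics point above, a lightweight timing wrapper is often the first step before wiring pipeline steps into a monitoring stack such as Prometheus or Grafana. This is an illustrative stdlib-only sketch; the step and metric names are made up:

```python
import time
from functools import wraps

# In production these timings would be exported to a metrics backend
# (e.g. Prometheus); here we just collect them in a dict. All names
# are illustrative, not from any specific system.
step_timings: dict[str, float] = {}

def timed_step(name: str):
    """Decorator that records the wall-clock duration of a pipeline step."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            finally:
                step_timings[name] = time.perf_counter() - start
        return wrapper
    return decorator

@timed_step("transform_events")
def transform_events(rows):
    return [r.upper() for r in rows]

result = transform_events(["click", "purchase"])
```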

💵 Compensation & Benefits

Salary Range: The salary range for this role is not specified. According to Glassdoor, the average salary for a Data Engineer in Peru is approximately 35,000 PEN per month. However, this can vary depending on factors such as experience, skills, and company size.

Benefits:

  • Competitive salary and benefits package
  • Opportunities for professional growth and development
  • Collaborative and innovative work environment
  • Global team with diverse perspectives and backgrounds

Working Hours: Full-time position with a standard workweek of 40 hours. Flexible working hours may be available to accommodate different time zones and project deadlines.

πŸ“ Enhancement Note: The provided salary range is an estimate based on market research and may vary depending on the candidate's experience, skills, and the company's internal compensation structure.

🎯 Team & Company Context

🏢 Company Culture

Industry: Groupon is a global leader in local e-commerce, connecting customers with local merchants by offering amazing deals on products, services, and experiences.

Company Size: Groupon has a large, globally distributed team with a strong focus on innovation, collaboration, and continuous learning.

Founded: Groupon was founded in 2008 and has since grown to become a publicly traded company with a presence in over 15 countries.

Team Structure:

  • 🌐 Data & Discovery: The Data & Discovery organization is responsible for driving Groupon's data strategy, enabling data-driven decision-making, and empowering teams to leverage AI and machine learning.
  • 🤝 Cross-Functional Collaboration: The team works closely with various departments, including Product, Engineering, and Marketing, to ensure data-driven initiatives align with business objectives and user needs.
  • 🌍 Global Distribution: The team is globally distributed, with members working remotely from various locations around the world.

Development Methodology:

  • 🔄 Agile Development: The team follows an agile development process, leveraging async communication and shared ownership to support fast-paced business needs.
  • 📈 Data-Driven Decision Making: The team prioritizes data-driven decision-making, using data and analytics to inform product development, marketing strategies, and business operations.

Company Website: groupon.com

πŸ“ Enhancement Note: Groupon's culture values innovation, collaboration, and continuous learning, providing an ideal environment for professionals seeking to grow and make a significant impact on a global scale.

📈 Career & Growth Analysis

🌱 Career Level: This role is suited to mid-senior professionals with 5-10 years of experience in cloud infrastructure, data platform engineering, or DevOps. It offers opportunities for growth and leadership within the Data & Discovery organization and Groupon as a whole.

👥 Reporting Structure: The Data Platform Infrastructure Engineer will report directly to the Manager of Data Infrastructure within the Data & Discovery organization.

📈 Technical Impact: This role has a significant impact on Groupon's data ecosystem, enabling data-driven initiatives and advancing the company's AI-first strategy.

🌱 Growth Opportunities:

  • 🚀 Technical Leadership: As the first Data Platform Infrastructure Engineer on the team, this role presents an opportunity to shape the team's technical direction and contribute to the development of best practices and standards.
  • 🌟 Mentorship & Leadership: The role provides opportunities to mentor junior engineers, contribute to team leadership, and help shape the team's culture and processes.
  • 💡 Continuous Learning: Working on cutting-edge technologies and collaborating with a diverse team of professionals offers ample opportunities for continuous learning and professional development.

πŸ“ Enhancement Note: This role offers a unique opportunity for mid-senior level professionals to make a significant impact on Groupon's data ecosystem and drive the company's AI-first strategy while growing both technically and professionally.

🌐 Work Environment

🏢 Office Type: Groupon offers a remote-friendly work environment, with team members working from various locations around the world.

πŸ“ Office Location(s): Groupon has offices in various locations worldwide, including Chicago, London, and Beijing. However, this role is remote and can be performed from anywhere with a reliable internet connection.

🌐 Workspace Context:

  • 💻 Remote Work: The remote work environment allows for flexibility and work-life balance, with team members able to work from the comfort of their own homes or preferred co-working spaces.
  • 🤝 Collaboration: Despite being remote, the team maintains a strong focus on collaboration, with regular virtual team meetings, async communication, and shared ownership of projects and initiatives.
  • 🌐 Global Perspective: Working with a globally distributed team exposes professionals to diverse perspectives, cultures, and ways of working, fostering a rich and inclusive work environment.

🕒 Work Schedule: The standard workweek is 40 hours, with flexible working hours available to accommodate different time zones and project deadlines.

πŸ“ Enhancement Note: Groupon's remote-friendly work environment offers professionals the flexibility to balance work and personal responsibilities while collaborating with a diverse and inclusive global team.

📄 Application & Technical Interview Process

πŸ“ Enhancement Note: The application and interview process for this role are designed to assess the candidate's technical skills, cultural fit, and ability to thrive in a remote, globally distributed team.

πŸ“ Interview Process:

  • 🚀 Technical Assessment: The first step in the interview process involves a technical assessment, focusing on the candidate's expertise in cloud infrastructure, data pipelines, and big data technologies. This may include a take-home project or a live coding challenge.
  • 🤝 Team Fit Assessment: The second step involves a team fit assessment, where the candidate will meet with members of the Data & Discovery organization to discuss their approach to collaboration, communication, and problem-solving in a remote work environment.
  • 💡 Cultural Fit Assessment: The final step in the interview process focuses on assessing the candidate's cultural fit with Groupon's values and work environment. This may involve a conversation with a member of the leadership team or a panel interview with a diverse group of stakeholders.

πŸ“ Portfolio Review Tips:

  • 🏢 Infrastructure Projects: Highlight your experience designing, building, and automating scalable data infrastructure on GCP or AWS using Terraform. Be prepared to discuss your approach to infrastructure as code, cloud architecture, and data security.
  • 🔒 Security & Compliance: Demonstrate your understanding of data security, access controls, and compliance standards by showcasing relevant projects or case studies. Be prepared to discuss your approach to implementing security best practices and ensuring compliance with relevant standards and regulations.
  • 📈 Data Pipelines: Showcase your ability to develop and maintain secure, scalable data pipelines using big data technologies like Google Dataproc, BigQuery, Spark, or Airflow. Be prepared to discuss your approach to data processing, transformation, and analysis.
  • 🌐 Cloud Architecture: Illustrate your expertise in cloud architecture by presenting your approach to designing and implementing reliable, cost-efficient, and secure data platforms. Be prepared to discuss your experience with cloud providers like GCP or AWS and your understanding of cloud best practices and architectural patterns.

πŸ“ Technical Challenge Preparation:

  • 🏢 Infrastructure Design: Brush up on your knowledge of cloud infrastructure, data pipelines, and big data technologies. Familiarize yourself with the latest best practices and trends in cloud architecture, data security, and compliance.
  • 🔒 Security & Compliance: Review your understanding of data security, access controls, and compliance standards. Ensure you are up-to-date with the latest security best practices and regulations relevant to cloud infrastructure and big data technologies.
  • 📈 Data Pipelines: Refresh your knowledge of big data technologies like Google Dataproc, BigQuery, Spark, or Airflow. Practice designing and implementing data pipelines using these tools and familiarize yourself with their features and limitations.
  • 🌐 Cloud Architecture: Review your understanding of cloud architecture, cloud providers like GCP or AWS, and cloud best practices. Practice designing and implementing cloud architectures that are reliable, cost-efficient, and secure.
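
One reliability pattern worth rehearsing for the technical assessment is retrying transient cloud API failures with exponential backoff and jitter. The sketch below uses only the standard library; `flaky_api` is a hypothetical stand-in for a real GCP or AWS client call:

```python
import random
import time

def retry_with_backoff(fn, max_attempts=5, base_delay=0.01):
    """Call fn, retrying transient failures with exponential backoff and jitter."""
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except ConnectionError:
            if attempt == max_attempts:
                raise
            # Exponential backoff with full jitter, growing per attempt.
            delay = random.uniform(0, base_delay * 2 ** attempt)
            time.sleep(delay)

# Hypothetical stand-in for a flaky cloud API call: fails twice, then succeeds.
calls = {"count": 0}
def flaky_api():
    calls["count"] += 1
    if calls["count"] < 3:
        raise ConnectionError("transient network error")
    return "ok"

result = retry_with_backoff(flaky_api)
```

Jitter matters in practice: without it, many clients that fail together also retry together, producing retry storms against the recovering service.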

πŸ“ ATS Keywords: Infrastructure as Code, Terraform, GCP, AWS, Cloud Infrastructure, Data Platform Engineering, DevOps, Big Data, Google Dataproc, BigQuery, Spark, Airflow, Data Security, Machine Learning, Remote Work, Global Team, Agile Development, Data-Driven Decision Making

πŸ“ Enhancement Note: The interview process for this role is designed to assess the candidate's technical skills, cultural fit, and ability to thrive in a remote, globally distributed team. By preparing thoroughly and showcasing your expertise in cloud infrastructure, data pipelines, and big data technologies, you can increase your chances of success in the interview process.

πŸ› οΈ Technology Stack & Web Infrastructure

🏢 Frontend Technologies: N/A (This role focuses on cloud infrastructure and data pipelines)

💻 Backend & Server Technologies:

  • 🌐 Cloud Platforms: GCP and AWS (including IAM, VPC, Storage, BigQuery, Cloud Functions)
  • 🛠️ Infrastructure as Code: Terraform
  • 💻 Programming & Scripting: Python and Shell
  • 📈 Big Data Technologies: Google Dataproc, Google Cloud Storage (GCS), BigQuery, Bigtable, Hive, Spark, and Airflow
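
Airflow, listed above, models a pipeline as a DAG of tasks executed in dependency order. The stdlib-only sketch below illustrates that scheduling idea with `graphlib`; the task names are hypothetical, and a real Airflow DAG would use `airflow.DAG` and operators instead:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of tasks it depends on,
# mirroring how Airflow resolves execution order from a DAG definition.
dag = {
    "extract": set(),
    "clean": {"extract"},
    "aggregate": {"clean"},
    "load_bigquery": {"aggregate"},
    "notify": {"load_bigquery"},
}

def run(dag: dict) -> list:
    """Execute tasks in a valid dependency order (Airflow-style)."""
    order = list(TopologicalSorter(dag).static_order())
    for task in order:
        print(f"running {task}")  # real code would invoke an operator here
    return order

execution_order = run(dag)
```

`TopologicalSorter` also raises `CycleError` on circular dependencies, the same class of mistake Airflow rejects when parsing a DAG.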

πŸ› οΈ Development & DevOps Tools:

  • πŸ› οΈ Infrastructure as Code: Terraform
  • πŸ“ˆ Monitoring Tools: Grafana, Elasticsearch, Prometheus, or Wavefront
  • πŸ”„ CI/CD Pipelines: Jenkins, GitHub Actions, or CircleCI
  • πŸ› οΈ Version Control: Git

πŸ“ Enhancement Note: This role requires a strong understanding of cloud infrastructure, data pipelines, and big data technologies. Familiarity with the specified technologies and tools is essential for success in this role.

👥 Team Culture & Values

🌐 Engineering Values:

  • 💡 Innovation: Groupon values innovation and encourages team members to think creatively and challenge the status quo.
  • 🌟 Collaboration: Groupon fosters a culture of collaboration, with team members working together to achieve common goals and drive business success.
  • 💻 Technical Excellence: Groupon prioritizes technical excellence and encourages team members to continuously learn and improve their skills.
  • 🌍 Inclusivity: Groupon values diversity and inclusivity, promoting a work environment where everyone feels valued and respected.

🤝 Collaboration Style:

  • 🔄 Async-First Communication: As a globally distributed team, day-to-day collaboration relies on async communication and shared ownership of projects and initiatives.
  • 💡 Knowledge Sharing: The team encourages knowledge sharing, technical mentoring, and continuous learning to help team members grow both personally and professionally.

πŸ“ Enhancement Note: Groupon's culture values innovation, collaboration, and technical excellence, providing an ideal environment for professionals seeking to grow and make a significant impact on a global scale.

⚡️ Challenges & Growth Opportunities

🌐 Technical Challenges:

  • 🏢 Infrastructure Design: Designing, building, and automating scalable data infrastructure on GCP or AWS using an infrastructure as code approach with Terraform presents a significant technical challenge.
  • 🔒 Security & Compliance: Ensuring platform reliability, security, and cost-efficiency through robust monitoring, automation, and compliance practices requires a deep understanding of data security, access controls, and compliance standards.
  • 📈 Data Pipelines: Developing and maintaining secure, scalable data pipelines using big data technologies like Google Dataproc, BigQuery, Spark, or Airflow requires expertise in data processing, transformation, and analysis.
  • 🌐 Cloud Architecture: Designing and implementing reliable, cost-efficient, and secure data platforms on GCP or AWS presents a significant technical challenge, requiring a strong understanding of cloud architecture, cloud providers, and cloud best practices.

🌱 Learning & Development Opportunities:

  • 💡 Technical Skill Development: Hands-on work with GCP, AWS, Terraform, and big data frameworks offers continuous opportunities to deepen and broaden your technical skills.
  • 🌟 Leadership Development: Shaping the team's technical direction and its best practices and standards builds leadership and mentorship skills over time.
  • 📈 Data-Driven Decision Making: Working in a data-driven organization develops skills in data analysis, data visualization, and evidence-based decision-making.

πŸ“ Enhancement Note: This role offers numerous technical challenges and growth opportunities, providing professionals with the chance to expand their skills, contribute to a global team, and drive Groupon's AI-first strategy.

💡 Interview Preparation

πŸ“ Technical Questions:

  • 🏢 Infrastructure Design: Be ready to walk through how you would design, build, and automate scalable data infrastructure on GCP or AWS with Terraform, including how you structure modules and manage state.
  • 🔒 Security & Compliance: Expect questions on access controls, data encryption, and compliance standards such as SOX, and on how you have applied them in practice.
  • 📈 Data Pipelines: Be prepared to reason through pipeline design with tools like Google Dataproc, BigQuery, Spark, or Airflow, covering scheduling, idempotency, and failure handling.
  • 🌐 Cloud Architecture: Expect architecture discussions weighing reliability, cost-efficiency, and security, including trade-offs between equivalent GCP and AWS services.

πŸ“ Company & Culture Questions:

  • 🌟 Innovation: Be prepared to discuss your approach to innovation and how you have driven technical excellence and business success in previous roles.
  • 🤝 Collaboration: Demonstrate your understanding of collaboration and how you have worked effectively with globally distributed teams to achieve common goals and drive business success.
  • 🌍 Inclusivity: Showcase your commitment to diversity and inclusivity by discussing your experience working with and supporting team members from various backgrounds and cultures.

πŸ“ Portfolio Presentation Strategy:

  • 🏒 Infrastructure Projects: Highlight your experience designing, building, and automating scalable data infrastructure on GCP or AWS using Terraform. Be prepared to discuss your approach to infrastructure as code, cloud architecture, and data security.
  • πŸ”’ Security & Compliance: Demonstrate your understanding of data security, access controls, and compliance standards by showcasing relevant projects or case studies. Be prepared to discuss your approach to implementing security best practices and ensuring compliance with relevant standards and regulations.
  • πŸ“ˆ Data Pipelines: Showcase your ability to develop and maintain secure, scalable data pipelines using big data technologies like Google Dataproc, BigQuery, Spark, or Airflow. Be prepared to discuss your approach to data processing, transformation, and analysis.
  • 🌐 Cloud Architecture: Illustrate your expertise in cloud architecture by presenting your approach to designing and implementing reliable, cost-efficient, and secure data platforms. Be prepared to discuss your experience with cloud providers, cloud best practices, and architectural patterns.

πŸ“ Enhancement Note: The interview process for this role is designed to assess the candidate's technical skills, cultural fit, and ability to thrive in a remote, globally distributed team. By preparing thoroughly and showcasing your expertise in cloud infrastructure, data pipelines, and big data technologies, you can increase your chances of success in the interview process.

📌 Application Steps

To apply for this Data Platform Infrastructure Engineer position at Groupon:

  1. πŸ“ Portfolio Customization: Tailor your portfolio to highlight your experience in cloud infrastructure, data pipelines, and big data technologies. Include projects that demonstrate your expertise in designing, building, and automating scalable data infrastructure on GCP or AWS using an infrastructure as code approach with Terraform.
  2. πŸ“ Resume Optimization: Optimize your resume to emphasize your relevant skills, experiences, and achievements in cloud infrastructure, data platform engineering, or DevOps. Include specific keywords and phrases related to the role and Groupon's technology stack to improve your resume's visibility in applicant tracking systems.
  3. πŸ“ Technical Interview Preparation: Brush up on your knowledge of cloud infrastructure, data pipelines, and big data technologies. Familiarize yourself with the latest best practices and trends in cloud architecture, data security, and compliance. Practice designing and implementing data pipelines and cloud architectures using GCP or AWS.
  4. πŸ“ Company Research: Research Groupon's company culture, values, and mission to ensure a strong cultural fit. Familiarize yourself with Groupon's products, services, and business model to understand the company's data-driven initiatives and AI-first strategy.

πŸ“ Enhancement Note: By following these application steps and preparing thoroughly, you can increase your chances of success in the interview process and secure your dream role as a Data Platform Infrastructure Engineer at Groupon.



Application Requirements

Candidates should have a Bachelor's or Master's degree in a related field and at least 5 years of hands-on experience in relevant areas. Strong expertise in cloud platforms, programming skills, and knowledge of Big Data frameworks are essential.