Data Platform Engineer

ShipHero
Full-time

πŸ“ Job Overview

  • Job Title: Data Platform Engineer
  • Company: ShipHero
  • Location: Argentina
  • Job Type: Full-Time, Remote
  • Category: DevOps Engineer, Data Engineer
  • Date Posted: 2025-07-30
  • Experience Level: Mid-Level (2-5 years)

πŸš€ Role Summary

  • πŸ“ Enhancement Note: This role involves operating, improving, and extending ShipHero's data infrastructure, supporting engineering teams, and optimizing databases for performance and scalability. It requires a strong background in data storage, processing, and stream processing, with experience in AWS or GCP and infrastructure-as-code tools.

πŸ’» Primary Responsibilities

  • πŸ“ Enhancement Note: The primary responsibilities revolve around data infrastructure management, optimization, and extension. This includes operational optimization, automation, and supporting engineering teams in enhancing data services' performance.

  • Operate, improve, and extend ShipHero’s data infrastructure.

  • Support engineering teams in enhancing the performance of reports, searches, and other data services.

  • Be part of the Platform Team to address performance and reliability challenges in our data systems.

  • Optimize both relational and non-relational databases for performance and scalability (an illustrative query-tuning sketch follows this list).

  • Automate infrastructure management using infrastructure-as-code for consistency, scalability, and repeatability.

  • Participate in a 24x7 on-call rotation.
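
As a small, hedged illustration of the database-optimization work described above, the sketch below inspects a slow query's plan and adds a covering index. It assumes a MySQL-compatible endpoint (the posting mentions Aurora MySQL) reached via the mysql-connector-python client; the table, columns, host, and credentials are placeholders, not ShipHero's actual schema or infrastructure.

```python
# Illustrative only: check a query plan, then add a composite index.
import mysql.connector  # pip install mysql-connector-python

conn = mysql.connector.connect(
    host="aurora-cluster.example.internal",  # placeholder endpoint
    user="readonly",
    password="...",
    database="warehouse",
)
cur = conn.cursor()

# 1. See how MySQL executes the query today.
cur.execute(
    "EXPLAIN SELECT id, status FROM orders "
    "WHERE account_id = %s AND created_at >= %s",
    (42, "2025-01-01"),
)
for row in cur.fetchall():
    print(row)  # look for full scans ('ALL') and large 'rows' estimates

# 2. If the plan shows a full scan, a composite index on the filter
#    columns typically turns it into a range scan.
cur.execute(
    "CREATE INDEX idx_orders_account_created "
    "ON orders (account_id, created_at)"
)
conn.commit()
cur.close()
conn.close()
```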

πŸŽ“ Skills & Qualifications

Education: Bachelor's degree in Computer Science, Engineering, or a related field.

Experience: Proven experience (2-5 years) in operating and improving data storage and processing systems.

Required Skills:

  • Solid experience with data storage and processing systems.
  • Experience with operational optimization (indexing, query tuning, and monitoring).
  • Familiarity with stream processing and tools such as Kafka or Kinesis (a minimal consumer sketch follows this list).
  • Experience with AWS or GCP and infrastructure-as-code tools (Terraform, Ansible).
  • Proficiency in Python, Airflow, Redshift, and dbt.
  • πŸ“ Enhancement Note: Experience with the AWS data ecosystem (Aurora MySQL, DocumentDB/MongoDB, OpenSearch/ElasticSearch, Redshift, Glue/Spark, MSK/Kafka, Kinesis, Debezium, S3/Apache Hudi, MWAA/Airflow) is highly appreciated.

Preferred Skills:

  • Experience with data warehouses, data lakes, and distributed processing systems.
  • Knowledge of infrastructure-as-code best practices.
  • Familiarity with data modeling and ETL processes.

πŸ“Š Portfolio & Project Requirements

Portfolio Essentials:

  • Demonstrate experience in operating and improving data storage and processing systems.
  • Showcase projects involving stream processing, AWS or GCP services, and infrastructure-as-code tools.
  • Highlight problem-solving skills and database optimization techniques.

Technical Documentation:

  • Provide documentation of past projects, highlighting data infrastructure design, optimization strategies, and performance metrics.
  • Include code samples demonstrating operational optimization techniques and stream processing implementations.

πŸ’΅ Compensation & Benefits

Salary Range: $80,000 - $120,000 USD per year (Based on regional market research and role requirements)

Benefits:

  • Equipment Budget: $2,500 to purchase necessary equipment for the role.
  • Paid Vacation: 20 days, plus additional time off for New Year's and Christmas.
  • Conference Days: Paid days off to attend conferences and stay up-to-date with industry trends.
  • Learning Opportunities: ShipHero will pay for courses and conferences to support continuous learning.

Working Hours: Full-time (40 hours/week) with a flexible schedule and a 24x7 on-call rotation.

🎯 Team & Company Context

🏒 Company Culture

Industry: E-commerce and logistics technology.

Company Size: Medium-sized company with a fully remote team.

Founded: 2014 (Company history and growth information not provided)

Team Structure:

  • The Platform Team focuses on data infrastructure, performance, and reliability.
  • Cross-functional collaboration with engineering teams to enhance data services' performance.

Development Methodology:

  • Agile development methodologies with a focus on continuous improvement and collaboration.
  • Infrastructure-as-code practices for consistency, scalability, and repeatability.

Company Website: ShipHero

πŸ“ Enhancement Note: ShipHero values collaboration, learning, and continuous improvement, as reflected in their remote work culture and emphasis on asynchronous work.

πŸ“ˆ Career & Growth Analysis

Career Level: Mid-Level Data Platform Engineer, responsible for operating, improving, and extending data infrastructure while supporting engineering teams.

Reporting Structure: Sits within the Platform Team, with cross-functional collaboration across engineering teams.

Technical Impact: Significant impact on data infrastructure performance, reliability, and scalability, directly influencing the user experience and business operations of ShipHero's e-commerce customers.

Growth Opportunities:

  • Technical growth opportunities in data infrastructure, stream processing, and AWS or GCP services.
  • Potential leadership roles within the Platform Team or broader engineering organization.
  • Opportunities to learn and work with emerging technologies in the data and e-commerce domains.

πŸ“ Enhancement Note: ShipHero's growth opportunities are tied to the company's expansion and the increasing complexity of its data infrastructure, offering technical and leadership growth paths for the right candidate.

🌐 Work Environment

Office Type: Fully remote work environment with a strong emphasis on asynchronous work and deep focus time.

Office Location(s): Argentina (Remote work allows for flexibility in location within Argentina)

Workspace Context:

  • Remote work allows for a flexible workspace tailored to individual preferences.
  • Emphasis on asynchronous work and deep focus time for uninterrupted productivity.
  • Collaborative work environment with regular video chat and Slack communication.

Work Schedule: Flexible work schedule with a focus on results and a 24x7 on-call rotation.

πŸ“ Enhancement Note: ShipHero's remote work environment encourages autonomy, flexibility, and deep focus, allowing employees to balance work and personal life effectively.

πŸ“„ Application & Technical Interview Process

Interview Process:

  1. Technical Screening: A conversation focused on data infrastructure, stream processing, and AWS or GCP services.
  2. System Design Challenge: A hands-on exercise to evaluate problem-solving skills and database optimization techniques.
  3. Behavioral Interview: A discussion focused on collaboration, learning, and adaptability.
  4. Final Interview: A conversation with the hiring manager to assess cultural fit and answer any remaining questions.

Portfolio Review Tips:

  • Highlight projects demonstrating experience in data storage, processing, and stream processing.
  • Showcase problem-solving skills and database optimization techniques.
  • Emphasize any experience with AWS or GCP services and infrastructure-as-code tools.

Technical Challenge Preparation:

  • Brush up on data infrastructure concepts, stream processing, and AWS or GCP services.
  • Practice database optimization techniques and problem-solving exercises.
  • Familiarize yourself with ShipHero's technology stack and e-commerce domain.

ATS Keywords: (Organized by category)

Programming Languages: Python, Bash, SQL

Data & Infrastructure Tools: Airflow, dbt, Terraform, Ansible

Server Technologies: AWS (Aurora MySQL, DocumentDB/MongoDB, OpenSearch/ElasticSearch, Redshift, Glue/Spark, MSK/Kafka, Kinesis, Debezium, S3/Apache Hudi, MWAA/Airflow), GCP

Databases: Relational databases, data warehouses, data lakes, distributed processing systems

Tools: Infrastructure-as-code tools (Terraform, Ansible), stream processing tools (Kafka, Kinesis)

Methodologies: Agile, infrastructure-as-code best practices

Soft Skills: Collaboration, learning, adaptability, problem-solving

Industry Terms: E-commerce, logistics technology, data infrastructure, stream processing, AWS, GCP

πŸ“ Enhancement Note: ShipHero's interview process focuses on technical skills, problem-solving, and cultural fit, with a strong emphasis on data infrastructure, stream processing, and AWS or GCP services.

πŸ›  Technology Stack & Web Infrastructure

Data Infrastructure Technologies:

  • AWS services (Aurora MySQL, DocumentDB/MongoDB, OpenSearch/ElasticSearch, Redshift, Glue/Spark, MSK/Kafka, Kinesis, Debezium, S3/Apache Hudi, MWAA/Airflow)
  • GCP services (if applicable)
  • Relational databases, data warehouses, data lakes, distributed processing systems
  • Stream processing tools (Kafka, Kinesis)

Programming Languages & Tools:

  • Python
  • Bash
  • SQL
  • Infrastructure-as-code tools (Terraform, Ansible)
  • Airflow
  • dbt

πŸ“ Enhancement Note: ShipHero's technology stack focuses on AWS or GCP services, data infrastructure, and stream processing, with a strong emphasis on infrastructure-as-code practices.

πŸ‘₯ Team Culture & Values

Engineering Values:

  • Collaboration: ShipHero values collaboration and teamwork, with a strong emphasis on asynchronous work and deep focus time.
  • Learning: ShipHero encourages continuous learning and staying up-to-date with industry trends.
  • Adaptability: ShipHero values adaptability and the ability to learn from mistakes.

Collaboration Style:

  • Cross-functional collaboration with engineering teams to enhance data services' performance.
  • Regular video chat and Slack communication to maintain a collaborative remote work environment.
  • Infrastructure-as-code practices for consistency, scalability, and repeatability.

πŸ“ Enhancement Note: ShipHero's web development values emphasize collaboration, learning, and adaptability, with a strong focus on asynchronous work and deep focus time in a remote work environment.

⚑ Challenges & Growth Opportunities

Technical Challenges:

  • Operating and improving data storage and processing systems in a rapidly growing e-commerce environment.
  • Optimizing both relational and non-relational databases for performance and scalability.
  • Automating infrastructure management using infrastructure-as-code for consistency, scalability, and repeatability (an illustrative Python-based sketch follows this list).
  • Addressing performance and reliability challenges in data systems with a 24x7 on-call rotation.
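
The posting names Terraform and Ansible as the infrastructure-as-code tools; since the sketches in this write-up stick to Python, the example below shows the same declarative idea using Pulumi's Python SDK instead. This is an assumption for illustration only, not ShipHero's tooling, and the resource names and tags are placeholders. It would run inside a Pulumi project via `pulumi up`.

```python
# Declarative infrastructure sketch with Pulumi's Python SDK.
import pulumi
import pulumi_aws as aws

# Declare the desired state; `pulumi up` computes and applies the diff,
# which is what gives infrastructure-as-code its repeatability.
raw_events = aws.s3.Bucket(
    "raw-events",
    tags={"team": "data-platform", "env": "dev"},
)

# Expose the generated bucket name as a stack output.
pulumi.export("raw_events_bucket", raw_events.bucket)
```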

Learning & Development Opportunities:

  • Technical growth opportunities in data infrastructure, stream processing, and AWS or GCP services.
  • Potential leadership roles within the Platform Team or broader engineering organization.
  • Opportunities to learn and work with emerging technologies in the data and e-commerce domains.

πŸ“ Enhancement Note: ShipHero's technical challenges and growth opportunities are tied to the company's expansion and the increasing complexity of its data infrastructure, offering technical and leadership growth paths for the right candidate.

πŸ’‘ Interview Preparation

Technical Questions:

  1. Data Infrastructure: Describe your experience with data storage and processing systems, and how you've optimized them for performance and scalability.
  2. Stream Processing: Explain your experience with stream processing and tools like Kafka or Kinesis, and how you've used them to address business challenges.
  3. AWS/GCP Services: Discuss your experience with AWS or GCP services, and how you've used them to build and manage data infrastructure.
  4. Problem-Solving: Describe a challenging data infrastructure problem you've faced and how you solved it, highlighting your problem-solving skills and database optimization techniques.

Company & Culture Questions:

  1. Collaboration: Describe your experience working in a remote, asynchronous environment and how you maintain collaboration and communication with team members.
  2. Learning: Explain how you stay up-to-date with industry trends and new technologies, and how you apply this knowledge to your work.
  3. Adaptability: Discuss a time when you had to adapt to a significant change in your work environment or technology stack, and how you approached the situation.

Portfolio Presentation Strategy:

  • Highlight projects demonstrating experience in data storage, processing, and stream processing.
  • Showcase problem-solving skills and database optimization techniques.
  • Emphasize any experience with AWS or GCP services and infrastructure-as-code tools.
  • Tailor your portfolio to ShipHero's technology stack and e-commerce domain.

πŸ“ Enhancement Note: ShipHero's interview preparation focuses on technical skills, problem-solving, and cultural fit, with a strong emphasis on data infrastructure, stream processing, and AWS or GCP services.

πŸ“Œ Application Steps

To apply for this Data Platform Engineer position:

  1. Customize your resume and portfolio to highlight your experience with data infrastructure, stream processing, and AWS or GCP services.
  2. Research ShipHero's technology stack, e-commerce domain, and company culture to demonstrate your understanding and fit for the role.
  3. Prepare for the technical screening, system design challenge, behavioral interview, and final interview by brushing up on your technical skills and practicing problem-solving exercises.
  4. Submit your application through the application link provided.

⚠️ Important Notice: This enhanced job description includes AI-generated insights and industry-standard assumptions. All details should be verified directly with the hiring organization before making application decisions.

Application Requirements

Candidates should have solid experience with data storage and processing systems, including operational optimization. Familiarity with AWS or GCP and tools like Terraform or Ansible is essential, along with a passion for learning and collaboration.