Senior Data Platform Engineer
📍 Job Overview
- Job Title: Senior Data Platform Engineer
- Company: Peridot Group
- Location: Budapest, Hungary (Hybrid)
- Job Type: Full-time, Hybrid
- Category: Data Engineering
- Date Posted: June 25, 2025
- Experience Level: Mid-Senior level (5-10 years)
🚀 Role Summary
- Design, build, and maintain scalable data infrastructure and warehousing solutions
- Collaborate with various teams to understand data needs and provide innovative solutions
- Ensure data governance, privacy, security, and compliance requirements are met
- Mentor colleagues to improve team capabilities and drive product innovation
📝 Enhancement Note: This role requires a strong technical background in data modeling, data warehousing, and cloud-based analytics to drive data-driven decision making across the organization.
💻 Primary Responsibilities
- Data Infrastructure & Platform:
- Design, build, and optimize scalable data warehousing solutions and data infrastructure
- Create and maintain robust ELT processes for data integration from multiple sources (a minimal ELT sketch follows this list)
- Develop and maintain data pipelines ensuring scalability, reliability, quality, and efficiency
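To make the ELT responsibility concrete, here is a minimal, hypothetical sketch in Python: rows are landed raw in a staging table first, and the transformation runs afterwards as SQL inside the warehouse. The table names are invented for illustration, and sqlite3 stands in for a real warehouse connection (Snowflake, BigQuery, etc.).

```python
import sqlite3  # stand-in for a real warehouse connection (Snowflake, BigQuery, ...)

def run_elt(source_rows: list[tuple]) -> None:
    """Load raw rows first, then transform inside the warehouse (ELT, not ETL)."""
    conn = sqlite3.connect(":memory:")  # placeholder for a warehouse session
    cur = conn.cursor()

    # Load: land source data untouched in a staging table.
    cur.execute("CREATE TABLE stg_orders (order_id INTEGER, amount REAL, status TEXT)")
    cur.executemany("INSERT INTO stg_orders VALUES (?, ?, ?)", source_rows)

    # Transform: push the computation down to SQL inside the warehouse.
    cur.execute("""
        CREATE TABLE fct_orders AS
        SELECT order_id, amount
        FROM stg_orders
        WHERE status = 'completed'
    """)
    conn.commit()
    print(cur.execute("SELECT COUNT(*) FROM fct_orders").fetchone()[0])

run_elt([(1, 9.99, "completed"), (2, 5.00, "cancelled")])
```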
- Data Architecture & Modeling:
- Design and implement data models following best practices and industry standards (a star-schema sketch follows this list)
- Optimize data warehouse performance and ensure data integrity across all systems
- Participate in data architecture and engineering decisions, bringing strong experience and knowledge to technical discussions
- Track key business metrics and KPIs
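For context on the modeling expectations above, the sketch below creates a minimal Kimball-style star schema: one additive fact table keyed to two dimension tables. All table and column names are invented; sqlite3 is used only so the DDL runs anywhere.

```python
import sqlite3

# Hypothetical star schema: one fact table joined to two dimensions via surrogate keys.
STAR_SCHEMA_DDL = """
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,  -- surrogate key
    customer_name TEXT,
    country       TEXT
);
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,      -- e.g. 20250625
    full_date TEXT,
    month     INTEGER
);
CREATE TABLE fct_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    amount       REAL                   -- additive measure
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(STAR_SCHEMA_DDL)
print("star schema created")
```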
- Data Governance & Quality:
- Establish and maintain data governance frameworks, including data lineage tracking, quality monitoring, and metadata management
- Implement data quality checks, validation rules, and automated testing to ensure data accuracy and consistency (a toy rule-based check follows this list)
- Define and enforce data standards, naming conventions, and documentation practices across all data assets
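As a toy illustration of automated quality checks (the rules and table are hypothetical), validation rules can be written as SQL predicates that should match zero rows; anything above zero is a violation:

```python
import sqlite3

# Each rule is a name plus a SQL predicate that flags violating rows.
QUALITY_RULES = {
    "no_null_order_id": "order_id IS NULL",
    "non_negative_amount": "amount < 0",
}

def run_quality_checks(conn: sqlite3.Connection, table: str) -> dict[str, int]:
    """Return violating-row counts per rule; all zeros means the table passes."""
    return {
        name: conn.execute(f"SELECT COUNT(*) FROM {table} WHERE {predicate}").fetchone()[0]
        for name, predicate in QUALITY_RULES.items()
    }

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (None, -5.0)])
print(run_quality_checks(conn, "orders"))  # {'no_null_order_id': 1, 'non_negative_amount': 1}
```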
- Collaboration & Leadership:
- Work collaboratively with various teams including Product, Operations, and Finance to understand data needs and provide solutions
- Provide mentorship by offering feedback and guiding colleagues to improve team capabilities
- Translate business requirements into technical solutions and drive product innovation
- Ensure data governance, privacy, security standards, and compliance requirements are met
🎓 Skills & Qualifications
Education: Bachelor's degree in Computer Science, Engineering, or a related field. Relevant experience may be considered in lieu of a degree.
Experience: 5+ years of proven data warehouse experience in designing, implementing, and managing large-scale data systems.
Required Skills:
- 5+ years of hands-on SQL experience with expertise in analytical querying and database performance tuning
- 5+ years of proven data warehouse experience in designing, implementing, and managing large-scale data systems
- Strong understanding of data modeling concepts, including dimensional data modeling, schema design, and data architecture patterns
- 5+ years of experience with at least one major cloud provider and its data services
- Proficiency with data orchestration tools such as Apache Airflow, Dagster, or similar workflow management systems (see the DAG sketch after this list)
- 5+ years of hands-on Python programming experience using Pandas or Polars, including comfort with virtual environments (venv)
- Experience with modern cloud data platforms such as Snowflake, Redshift, Databricks, BigQuery
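As a reference point for the orchestration requirement, here is a minimal DAG sketch using Airflow's TaskFlow API (assuming Airflow 2.x; the DAG name, schedule, and task bodies are placeholders, not a prescribed design):

```python
from datetime import datetime

from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def daily_orders_pipeline():
    @task
    def extract() -> list[dict]:
        # Placeholder: pull rows from a source system.
        return [{"order_id": 1, "amount": 9.99}]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder: write rows to the warehouse.
        print(f"loading {len(rows)} rows")

    load(extract())  # extract -> load dependency

daily_orders_pipeline()
```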
Preferred Skills:
- Extensive experience with DBT, including model design, testing, performance tuning, and CI/CD best practices
- Hands-on experience with DBT workflow integration with cloud services and orchestration tools
- Business analyst mindset with the ability to translate business needs into technical solutions and communicate insights effectively
- Snowflake expertise, including experience with data loads/unloads, table design, permission models, views, and DDL/DML queries
- Business Intelligence tools experience, including Power BI, Tableau, or similar visualization platforms, with expertise in DAX, Power Query
- Experience with infrastructure as code (Terraform, AWS CDK); a minimal CDK sketch follows
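For the infrastructure-as-code item, a minimal AWS CDK (Python) sketch might look like the following; the stack and bucket names are invented, and this assumes aws-cdk-lib v2 is installed:

```python
from aws_cdk import App, RemovalPolicy, Stack
from aws_cdk import aws_s3 as s3
from constructs import Construct

class DataLakeStack(Stack):
    """Hypothetical stack provisioning a versioned bucket for raw data."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        s3.Bucket(
            self,
            "RawZoneBucket",
            versioned=True,                       # keep history of raw loads
            removal_policy=RemovalPolicy.RETAIN,  # never delete data on stack teardown
        )

app = App()
DataLakeStack(app, "data-platform-dev")
app.synth()  # emits a CloudFormation template
```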
📊 Portfolio & Project Requirements
- Portfolio Essentials:
- Demonstrate expertise in data modeling, data warehousing, and cloud-based analytics through previous projects
- Showcase ability to design, build, and maintain scalable data infrastructure and warehousing solutions
- Highlight experience in creating robust ELT processes and developing data pipelines
- Display understanding of data governance, privacy, security, and compliance requirements
- Technical Documentation:
- Provide well-commented and well-documented code, including version control, deployment processes, and server configuration
- Include testing methodologies, performance metrics, and optimization techniques in your portfolio
- Showcase data lineage tracking, quality monitoring, and metadata management skills
💵 Compensation & Benefits
Salary Range: €70,000 - €90,000 per year (Based on market research for Senior Data Engineers in Budapest, Hungary)
Benefits:
- Competitive salary and benefits package
- Opportunity to work in a global and dynamic environment
- Collaborative and supportive team culture
- Opportunities for professional growth and development
Working Hours: Full-time (40 hours/week), with flexibility for project deadlines and maintenance windows
📝 Enhancement Note: The salary range provided is based on market research for Senior Data Engineers in Budapest, Hungary. The actual salary may vary depending on the candidate's experience and skills.
🎯 Team & Company Context
🏢 Company Culture
Industry: Financial Services - Global provider of working capital solutions
Company Size: Medium to Large (75+ countries, over 250 employees)
Founded: 2007 (18 years ago)
Team Structure:
- Collaborative and cross-functional teams, including Product, Operations, Finance, and Data
- Flat hierarchy with a strong focus on mentorship and knowledge sharing
- Global presence with remote and hybrid work arrangements
Development Methodology:
- Agile/Scrum methodologies for project management and sprint planning
- Code review, testing, and quality assurance practices
- Deployment strategies, CI/CD pipelines, and server management
Company Website: www.gscf.com
📝 Enhancement Note: GSCF's global presence and collaborative team culture provide ample opportunities for professional growth and exposure to diverse data challenges.
📈 Career & Growth Analysis
Career Level: Senior Data Engineer - Responsible for designing, building, and maintaining large-scale data systems, ensuring data quality, and driving data-driven decision making across the organization.
Reporting Structure: This role reports directly to the Head of Data and will work closely with various teams, including Product, Operations, and Finance.
Technical Impact: The Senior Data Platform Engineer will have a significant impact on data-driven decision making, ensuring data accuracy, consistency, and availability across the organization.
Growth Opportunities:
- Career Progression: Opportunities for advancement to Principal or Director level roles within the Data team or other departments within the company.
- Technical Skill Development: Continuous learning and skill development in emerging data technologies, cloud services, and data governance practices.
- Leadership Potential: Mentorship opportunities and the chance to drive product innovation and architecture decisions.
🌐 Work Environment
Office Type: Hybrid - A combination of on-site and remote work arrangements, with a focus on collaboration and team building.
Office Location(s): Budapest, Hungary - With a global presence, there may be opportunities for travel and remote work with international teams.
Workspace Context:
- Collaborative workspaces designed for team interaction and knowledge sharing
- Access to modern development tools, multiple monitors, and testing devices
- Cross-functional collaboration with Product, Operations, Finance, and other stakeholders
Work Schedule: Full-time (40 hours/week), with flexibility for project deadlines and maintenance windows. The work schedule may vary depending on the project and team needs.
📝 Enhancement Note: GSCF's hybrid work environment encourages collaboration and team building while providing the flexibility to balance work and personal life.
📄 Application & Technical Interview Process
Interview Process:
- Technical Assessment: A hands-on technical assessment focused on SQL, data modeling, and cloud-based analytics. Candidates will be asked to design, implement, and optimize data warehousing solutions and data pipelines.
- Architecture Discussion: A discussion focused on data architecture and engineering decisions, where candidates will be expected to demonstrate strong experience and knowledge in data modeling, data warehousing, and cloud-based analytics.
- Behavioral Interview: An interview focused on problem-solving, communication, and collaboration skills. Candidates will be asked to provide examples of their ability to work collaboratively with various teams and drive product innovation.
- Final Evaluation: A final evaluation based on the candidate's technical skills, cultural fit, and potential for growth within the organization.
Portfolio Review Tips:
- Highlight your expertise in data modeling, data warehousing, and cloud-based analytics through previous projects
- Showcase your ability to design, build, and maintain scalable data infrastructure and warehousing solutions
- Demonstrate your experience in creating robust ELT processes and developing data pipelines
- Include data lineage tracking, quality monitoring, and metadata management skills in your portfolio
Technical Challenge Preparation:
- Brush up on your SQL skills, focusing on analytical querying and database performance tuning (a sample window-function query follows this list)
- Familiarize yourself with data modeling concepts, including dimensional data modeling, schema design, and data architecture patterns
- Review your experience with cloud services, data orchestration tools, and modern cloud data platforms such as Snowflake, Redshift, Databricks, and BigQuery
- Prepare for questions on data governance, privacy, security, and compliance requirements
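As one example of the analytical SQL worth practicing, the snippet below ranks each customer's orders with a window function (the schema and data are invented; sqlite3, which supports window functions from version 3.25, stands in for a real warehouse):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (1, 25.0), (2, 7.5)])

# Window function: rank each customer's orders by amount, largest first.
query = """
SELECT customer_id,
       amount,
       RANK() OVER (PARTITION BY customer_id ORDER BY amount DESC) AS amount_rank
FROM orders
"""
for row in conn.execute(query):
    print(row)
```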
ATS Keywords: (Organized by category)
- Programming Languages: SQL, Python, Pandas, Polars, DAX, Power Query
- Data Frameworks & Orchestration: DBT, Apache Airflow, Dagster
- Cloud & Infrastructure: AWS, GCP, Azure, Terraform, AWS CDK
- Databases: Snowflake, Redshift, Databricks, BigQuery
- Tools: Power BI, Tableau, venv
- Methodologies: Agile, Scrum, CI/CD, Infrastructure as Code
- Soft Skills: Collaboration, Mentorship, Communication, Problem-solving
- Industry Terms: Data Warehouse, Data Modeling, Data Governance, Data Quality, Data Integration, Data Pipeline, ELT, ETL, Cloud-based Analytics
📝 Enhancement Note: The interview process for this role is designed to assess the candidate's technical skills, problem-solving abilities, and cultural fit within the organization. Successful candidates will demonstrate a strong background in data modeling, data warehousing, and cloud-based analytics, as well as excellent communication and collaboration skills.
🛠 Technology Stack & Infrastructure
Data Warehousing & Infrastructure:
- Modern cloud data platforms: Snowflake, Redshift, Databricks, BigQuery
- Data orchestration tools: Apache Airflow, Dagster
- Infrastructure as Code (IaC) tools: Terraform, AWS CDK
Data Modeling & Architecture:
- Data modeling concepts: Dimensional data modeling, schema design, data architecture patterns
- Data governance frameworks: Data lineage tracking, quality monitoring, metadata management
- Data quality checks, validation rules, and automated testing
Programming Languages & Tools:
- SQL: Analytical querying, database performance tuning
- Python: Data manipulation, analysis, and visualization
- Pandas, Polars: Data manipulation and analysis libraries (a short Polars example appears below)
- DBT: Data transformation and testing tool
- Power BI, Tableau: Business Intelligence tools for data visualization and reporting
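To illustrate the Pandas/Polars line above, here is a short Polars example (the column names and figures are invented):

```python
import polars as pl

# Toy orders table; columns are invented for the example.
df = pl.DataFrame({
    "order_id": [1, 2, 3],
    "amount": [9.99, 5.00, 12.50],
    "status": ["completed", "cancelled", "completed"],
})

# Filter then aggregate: total revenue from completed orders.
completed_revenue = (
    df.filter(pl.col("status") == "completed")
      .select(pl.col("amount").sum().alias("revenue"))
)
print(completed_revenue)
```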
Cloud Services:
- Major cloud providers: AWS, GCP, Azure
- Cloud-based analytics: Data processing, storage, and querying in the cloud
Version Control & Deployment:
- Version control systems: Git, GitHub, GitLab
- Deployment strategies: CI/CD pipelines, automated deployment, infrastructure as code
👥 Team Culture & Values
Data Team Values:
- Expertise: Deep technical expertise in data modeling, data warehousing, and cloud-based analytics
- Collaboration: Strong communication and collaboration skills, with a focus on driving product innovation and architecture decisions
- Mentorship: A commitment to knowledge sharing and mentoring colleagues to improve team capabilities
- Data-driven: A data-driven mindset, with a focus on using data to drive business decisions and improve performance
Collaboration Style:
- Cross-functional Integration: Collaborative work with various teams, including Product, Operations, Finance, and other stakeholders
- Code Review Culture: Regular code reviews and peer programming practices to ensure data quality and consistency
- Knowledge Sharing: A culture of knowledge sharing, continuous learning, and mentorship
📝 Enhancement Note: GSCF's data team values expertise, collaboration, and a data-driven mindset, with a strong focus on mentorship and knowledge sharing.
💡 Interview Preparation
Technical Questions:
- Data Warehousing & Infrastructure:
- Describe your experience with modern cloud data platforms such as Snowflake, Redshift, Databricks, and BigQuery.
- How have you optimized data warehouse performance and ensured data integrity across all systems?
- Can you walk us through your experience with data orchestration tools like Apache Airflow or Dagster?
- How have you implemented data quality checks, validation rules, and automated testing in your previous roles?
- Data Modeling & Architecture:
- Can you explain your understanding of data modeling concepts, including dimensional data modeling, schema design, and data architecture patterns?
- How have you designed and implemented data models following best practices and industry standards?
- Can you describe a complex data architecture challenge you've faced and how you approached it?
- Collaboration & Leadership:
- How have you worked collaboratively with various teams, including Product, Operations, and Finance, to understand data needs and provide solutions?
- Can you provide an example of a time when you mentored a colleague to improve their technical capabilities?
- How have you translated business requirements into technical solutions and driven product innovation in your previous roles?
Company & Culture Questions:
- What attracts you to GSCF and this role specifically?
- How do you see yourself contributing to our data team's values and collaboration style?
- Can you describe a time when you had to adapt to a significant change in your work environment, and how you handled it?
Portfolio Presentation Strategy:
- Highlight your expertise in data modeling, data warehousing, and cloud-based analytics through previous projects
- Showcase your ability to design, build, and maintain scalable data infrastructure and warehousing solutions
- Demonstrate your experience in creating robust ELT processes and developing data pipelines
- Include data lineage tracking, quality monitoring, and metadata management skills in your portfolio
- Tailor your presentation to GSCF's data team values and collaboration style, emphasizing your fit within the organization
📌 Application Steps
To apply for this Senior Data Platform Engineer position at GSCF:
- Customize Your Portfolio: Tailor your portfolio to highlight your expertise in data modeling, data warehousing, and cloud-based analytics, with a focus on scalable data infrastructure and warehousing solutions, robust ELT processes, and data pipeline development.
- Optimize Your Resume: Highlight your relevant experience and skills, including data modeling, data warehousing, cloud services, data orchestration tools, and modern cloud data platforms. Emphasize your problem-solving abilities, communication skills, and cultural fit within the organization.
- Prepare for Technical Assessment: Brush up on your SQL skills, focusing on analytical querying and database performance tuning. Review your experience with data modeling concepts, cloud services, data orchestration tools, and modern cloud data platforms. Familiarize yourself with data governance, privacy, security, and compliance requirements.
- Research the Company: Learn about GSCF's global presence, collaborative team culture, and data-driven approach to decision making. Understand the company's role in the financial services industry and its commitment to working capital solutions.
⚠️ Important Notice: This enhanced job description includes AI-generated insights and industry-standard assumptions. All details should be verified directly with the hiring organization before making application decisions.
Application Requirements
Candidates should have over 5 years of experience in SQL, data warehousing, and cloud services. Strong skills in data modeling, Python programming, and data orchestration tools are also required.