Data Platform Engineer

Marshalls PLC
Full-time · Elland, United Kingdom

📍 Job Overview

  • Job Title: Data Platform Engineer
  • Company: Marshalls PLC
  • Location: Elland, England, United Kingdom
  • Job Type: Full-time
  • Category: Data Engineering
  • Date Posted: 2025-08-08
  • Experience Level: Mid-Senior level (5-10 years)
  • Remote Status: Hybrid (On-site & Remote)

🚀 Role Summary

  • Key Responsibilities: Design, develop, and maintain robust, secure, and reliable data pipelines for key enterprise systems, including ERP, CRM, and analytics platforms. Collaborate with IT, BI, and operational teams to support a resilient, scalable, and governed data platform.
  • Key Technologies: SQL Server, SQL MI, Synapse, Data Lake, Microsoft Fabric, Dynamics AX/D365, CRM, Azure Monitor, Log Analytics, Python, PySpark, ETL/ELT frameworks, Azure DevOps, Git, Infrastructure as Code (ARM, Terraform, Bicep).

💻 Primary Responsibilities

🔑 Data Pipeline Development & Management

  • Design & Develop: Build and maintain automated, secure data pipelines from various sources, ensuring data quality, consistency, and lineage tracking (a minimal pipeline sketch follows this list).
  • Manage & Maintain: Oversee the capacity, scalability, and configuration of the data platform, including SQL Server, SQL MI, Synapse, Data Lake, and Microsoft Fabric.
  • Monitor & Administer: Implement robust alerting and performance tracking using Azure Monitor and Log Analytics. Support platform lifecycle tasks and contribute to DevOps and knowledge sharing.
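As a hedged illustration of these responsibilities, the sketch below shows a minimal PySpark extract-transform-load flow with a basic data-quality gate and lineage metadata. The server, credentials, table, and storage paths are placeholders, the structure is an assumption rather than Marshalls' actual pipeline design, and it presumes a Spark environment with a SQL Server JDBC driver available.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("erp_sales_orders_pipeline").getOrCreate()

# Extract: read a source table over JDBC (all connection details are placeholders).
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://<server>;databaseName=<erp_db>")
    .option("dbtable", "dbo.SalesOrders")
    .option("user", "<user>")
    .option("password", "<password>")
    .load()
)

# Basic data-quality gate: abort the load if key fields are missing.
null_keys = orders.filter(F.col("OrderId").isNull()).count()
if null_keys > 0:
    raise ValueError(f"{null_keys} rows have a null OrderId; aborting load")

# Transform: standardise column names and stamp load metadata for lineage.
curated = (
    orders.withColumnRenamed("OrderId", "order_id")
    .withColumn("source_system", F.lit("ERP"))
    .withColumn("load_ts", F.current_timestamp())
    .withColumn("load_date", F.current_date())
)

# Load: append to the lake as Parquet partitioned by load date (path is a placeholder).
curated.write.mode("append").partitionBy("load_date").parquet(
    "abfss://curated@<storage_account>.dfs.core.windows.net/sales_orders"
)
```

In practice an orchestrator (Synapse pipelines or Azure Data Factory) and a secret store would wrap a job like this rather than hard-coded options; the point here is the shape of the extract, quality-check, transform, and load steps.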

🔒 Data Governance & Security

  • Govern & Secure: Ensure data governance and security by using tools like Purview and adhering to compliance standards (e.g., GDPR, ISO 27001).
  • Collaborate & Support: Work with BI teams to ensure clean, reliable data delivery to analytics models (one common masking pattern is sketched after this list). Support the migration to Microsoft Fabric, ensuring a modern, efficient data architecture.
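The posting names Purview and GDPR but does not prescribe a masking technique; the sketch below shows one common, hypothetical pattern for pseudonymising personal identifiers before data reaches analytics models. Column names, paths, and the salt handling are illustrative assumptions.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pii_pseudonymisation").getOrCreate()

# Hypothetical curated customer extract; the path and columns are illustrative only.
customers = spark.read.parquet(
    "abfss://curated@<storage_account>.dfs.core.windows.net/customers"
)

# Pseudonymise direct identifiers with a salted SHA-256 hash and drop free-text
# fields that analytics models do not need. In practice the salt would come from
# a secret store such as Key Vault rather than being hard-coded.
salt = "<salt-from-key-vault>"
masked = (
    customers
    .withColumn("email_hash", F.sha2(F.concat(F.col("email"), F.lit(salt)), 256))
    .drop("email", "phone_number", "notes")
)

masked.write.mode("overwrite").parquet(
    "abfss://analytics@<storage_account>.dfs.core.windows.net/customers_masked"
)
```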

🎓 Skills & Qualifications

📚 Education & Experience

  • Education: Relevant degree or equivalent experience in data engineering, computer science, or a related field.
  • Experience: Proven experience (5-10 years) as a Data Engineer, Data Platform Engineer, or similar role in enterprise environments.

🛠 Required Skills

  • Technical Skills: Advanced SQL development skills, Python/PySpark, experience with modern ETL/ELT frameworks, familiarity with cloud-native data platforms (Microsoft Fabric, Synapse, Data Factory), and data governance tools like Purview.
  • Domain Knowledge: Experience integrating data from ERP/CRM systems (e.g., Dynamics 365) and operational data sources. Understanding of data security and compliance (e.g., GDPR, ISO 27001).
  • Soft Skills: Strong problem-solving, diagnostic, and collaboration skills, with the ability to work effectively within cross-functional teams.

🌟 Preferred Skills

  • Certifications: Microsoft certifications (e.g., DP-700, DP-203, DP-900) are highly desirable.
  • DevOps: Knowledge of CI/CD, Git, and Infrastructure as Code (e.g., ARM, Terraform, Bicep).

📊 Portfolio & Project Requirements

🎯 Portfolio Essentials

  • Data Pipeline Projects: Demonstrate experience in designing, developing, and maintaining data pipelines using relevant technologies (e.g., Synapse pipelines, Azure Data Factory).
  • Data Governance & Security: Showcase projects that highlight data governance, security, and compliance aspects.
  • Cloud-Native Platforms: Highlight experience with cloud-native data platforms and migrations to modern platforms like Microsoft Fabric.

📜 Technical Documentation

  • Data Lineage & Dependency Maps: Document data flows, pipeline dependencies, and technical configurations to support transparency and maintainability.
  • Code Comments & Documentation: Demonstrate clear and concise code commenting and documentation practices (an illustrative example follows this list).
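As a hedged example of the documentation style described above, the function below records source-to-target lineage and the applied transformations in its docstring. The zones, column names, and downstream dataset are hypothetical.

```python
from pyspark.sql import DataFrame
from pyspark.sql import functions as F


def curate_invoices(raw_invoices: DataFrame) -> DataFrame:
    """Curate ERP invoice extracts for the finance analytics model.

    Lineage (illustrative):
        source     : D365 ERP -> raw zone ``raw/erp/invoices`` (daily extract)
        target     : curated zone ``curated/finance/invoices``
        downstream : finance reporting dataset

    Transformations:
        * drop technical columns emitted by the extract job
        * cast ``invoice_date`` to DATE and ``amount`` to DECIMAL(18, 2)
        * stamp each row with a load timestamp for auditability
    """
    return (
        raw_invoices.drop("_extract_run_id")
        .withColumn("invoice_date", F.col("invoice_date").cast("date"))
        .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
        .withColumn("load_ts", F.current_timestamp())
    )
```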

💵 Compensation & Benefits

💰 Salary Range

  • Estimate: £50,000 - £65,000 per annum (Based on UK market rates for mid-senior level data engineers with relevant experience)
  • Currency: GBP

🎁 Benefits

  • Holiday: 34 days per annum (including bank holidays)
  • Healthcare: Health care cash plan covering dental, optical, prescription costs, and more
  • Family Leave: Enhanced maternity, paternity, and adoption pay and leave
  • Pension: 5% employer matched pension scheme
  • Perks: Cycle to work scheme, employee discount on Marshalls and Marley products, retailer discounts, Marshalls Wellbeing Centre, share purchase scheme, life assurance

🎯 Team & Company Context

🏢 Company Culture

  • Industry: Building materials and hard landscaping products
  • Company Size: Large (Over 5,000 employees)
  • Founded: 1890
  • Team Structure: The Data & Insights team works closely with IT, BI, and operational teams to support a resilient, scalable, and governed data platform. The Digital Transformation Team focuses on transitioning toward a modern, unified analytics platform (Microsoft Fabric).
  • Development Methodology: Agile/Scrum methodologies, with a focus on collaboration, cross-functional integration, and continuous improvement.

📈 Career & Growth Analysis

  • Career Level: Mid-Senior level data engineering role with opportunities for growth in technical leadership, architecture, and data governance.
  • Reporting Structure: The Data Platform Engineer reports to the Data Architect and collaborates with IT, BI, and operational teams.
  • Technical Impact: The role has a significant impact on data quality, consistency, and availability, supporting decision-making across the business.

🌐 Work Environment

  • Office Type: Hybrid (On-site & Remote) with flexible working arrangements
  • Office Location(s): Elland, England, United Kingdom
  • Workspace Context: The workspace is collaborative, with a focus on knowledge sharing, technical mentoring, and continuous learning, supported by modern development and data tooling.
  • Work Schedule: Full-time (40 hours per week), with flexibility expected around deployment windows, maintenance activities, and project deadlines.

📄 Application & Technical Interview Process

📝 Interview Process

  1. Technical Phone Screen: A brief phone or video call to discuss your technical background and fit for the role.
  2. Technical Deep Dive: A detailed technical interview focusing on your data engineering skills, problem-solving abilities, and experience with relevant technologies.
  3. Behavioral & Cultural Fit: An interview to assess your soft skills, collaboration abilities, and cultural fit within the team and organization.
  4. Final Review & Decision: A final review of your application, portfolio, and interview performance to make a hiring decision.

📝 Portfolio Review Tips

  • Data Pipeline Projects: Highlight your experience in designing, developing, and maintaining data pipelines using relevant technologies.
  • Data Governance & Security: Emphasize your understanding of data governance, security, and compliance aspects in your projects.
  • Cloud-Native Platforms: Showcase your experience with cloud-native data platforms and migrations to modern platforms like Microsoft Fabric.

📝 Technical Challenge Preparation

  • Data Pipeline Exercises: Practice designing and implementing data pipelines using relevant technologies (e.g., Synapse pipelines, Azure Data Factory); a sample exercise sketch follows this list.
  • Data Governance & Security Scenarios: Prepare for scenarios that involve data governance, security, and compliance aspects.
  • Problem-Solving & Collaboration: Brush up on your problem-solving skills and be ready to discuss your approach to collaboration and cross-functional teamwork.
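One hedged practice exercise consistent with the advice above is an incremental (watermark-based) load, a pattern that comes up frequently in pipeline discussions; the posting does not specify this exercise, and the paths and column names below are illustrative.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("incremental_load_exercise").getOrCreate()

# Exercise: append only rows changed since the last successful run (watermark pattern).
source_path = "/tmp/exercise/orders_source"
target_path = "/tmp/exercise/orders_curated"

try:
    # Highest modified timestamp already present in the target.
    watermark = (
        spark.read.parquet(target_path)
        .agg(F.max("modified_ts").alias("wm"))
        .collect()[0]["wm"]
    )
except Exception:
    watermark = None  # first run: take everything

source = spark.read.parquet(source_path)
delta = source if watermark is None else source.filter(F.col("modified_ts") > F.lit(watermark))

delta.write.mode("append").parquet(target_path)
print(f"Appended {delta.count()} new or changed rows")
```

A useful follow-up discussion point is how the same pattern maps onto change-data-capture feeds or Delta/Fabric incremental loads.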

🛠 Technology Stack & Infrastructure

🔧 Backend & Server Technologies

  • SQL Server: Relational database management system for on-premises and cloud-based data storage and retrieval.
  • SQL MI (Azure SQL Managed Instance): A fully managed SQL Server instance in Azure with built-in high availability and disaster recovery capabilities.
  • Synapse (Azure Synapse Analytics): An integrated analytics service combining data integration pipelines, enterprise data warehousing, and big data processing (a minimal connectivity sketch follows this list).
  • Data Lake (Azure Data Lake Storage): Scalable storage for structured and unstructured data, optimized for large-scale parallel analytics workloads.
  • Microsoft Fabric: A unified analytics platform that brings together data integration, data warehousing, and data governance capabilities.
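As a hedged illustration of working against these engines from Python, the sketch below runs a simple query against a SQL endpoint (SQL MI or a Synapse dedicated pool) via pyodbc. It assumes ODBC Driver 18 for SQL Server is installed and that Azure AD interactive authentication is appropriate; the server and database names are placeholders.

```python
import pyodbc

# Placeholder connection details; assumes ODBC Driver 18 for SQL Server is installed.
conn_str = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<sql-mi-or-synapse-endpoint>;"
    "DATABASE=<database>;"
    "Authentication=ActiveDirectoryInteractive;"
    "Encrypt=yes;"
)

conn = pyodbc.connect(conn_str)
try:
    cursor = conn.cursor()
    # List the most recently created tables as a simple connectivity check.
    cursor.execute(
        "SELECT TOP (5) name, create_date FROM sys.tables ORDER BY create_date DESC"
    )
    for name, create_date in cursor.fetchall():
        print(name, create_date)
finally:
    conn.close()
```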

🛠 Development & DevOps Tools

  • Azure DevOps: A set of development, project management, and version control tools for collaborative software development.
  • Git: A distributed version control system that enables multiple developers to work together on a project.
  • Infrastructure as Code (IaC): Tools like ARM, Terraform, and Bicep enable the provisioning and management of infrastructure through code, ensuring consistency and automation.

👥 Team Culture & Values

🌟 Data Engineering Values

  • Data Quality & Integrity: A strong focus on data quality, consistency, and lineage tracking to ensure reliable and accurate data delivery.
  • Collaboration & Cross-Functional Integration: Close collaboration with IT, BI, and operational teams to support a resilient, scalable, and governed data platform.
  • Continuous Learning & Improvement: A commitment to continuous learning, knowledge sharing, and technical mentoring within the team.

🤝 Collaboration Style

  • Cross-Functional Collaboration: Close collaboration with IT, BI, and operational teams to ensure high-quality data delivery and support decision-making across the business.
  • Code Review & Knowledge Sharing: A culture of code review, technical mentoring, and knowledge sharing to ensure high-quality data pipelines and maintainability.
  • Agile Methodologies: The use of Agile/Scrum methodologies to support a collaborative, cross-functional, and continuous improvement approach to data engineering.

⚡ Challenges & Growth Opportunities

🛑 Technical Challenges

  • Data Pipeline Complexity: Designing, developing, and maintaining complex data pipelines that integrate data from various sources, including ERP, CRM, and operational data sources.
  • Data Governance & Security: Ensuring data governance, security, and compliance in a large, enterprise environment with strict regulations and standards.
  • Cloud-Native Platform Migration: Supporting the migration to Microsoft Fabric, ensuring modern, efficient data architecture, and minimizing downtime and data loss.

🌱 Learning & Development Opportunities

  • Technical Skill Development: Continuous learning and development opportunities in data engineering, cloud-native platforms, data governance, and emerging technologies.
  • Leadership & Mentoring: Opportunities for growth in technical leadership, architecture, and data governance, with a focus on knowledge sharing and mentoring within the team.
  • Conferences & Certifications: Opportunities to attend industry conferences, obtain relevant certifications (e.g., Microsoft certifications), and engage with the data engineering community.

💡 Interview Preparation

💭 Technical Questions

  • Data Pipeline Design & Development: Be prepared to discuss your experience in designing, developing, and maintaining data pipelines using relevant technologies.
  • Data Governance & Security: Brush up on your understanding of data governance, security, and compliance aspects, and be ready to discuss your approach to these critical areas.
  • Problem-Solving & Collaboration: Prepare for scenarios that involve problem-solving, collaboration, and cross-functional teamwork, demonstrating your ability to work effectively with various teams.

💬 Company & Culture Questions

  • Company Culture & Values: Research Marshalls PLC's company culture, values, and commitment to digital transformation. Be prepared to discuss how your skills and experience align with the company's goals and objectives.
  • Data-Driven Decision Making: Understand how data-driven decision-making is integral to Marshalls PLC's success, and be ready to discuss your approach to supporting this critical business function.
  • Agile Methodologies: Brush up on your understanding of Agile/Scrum methodologies and be prepared to discuss your experience working in an Agile environment.

📝 Portfolio Presentation Strategy

  • Data Pipeline Projects: Highlight your experience in designing, developing, and maintaining data pipelines using relevant technologies, emphasizing your problem-solving skills and collaboration abilities.
  • Data Governance & Security: Emphasize your understanding of data governance, security, and compliance aspects in your projects, demonstrating your attention to detail and commitment to data quality and integrity.
  • Cloud-Native Platforms: Showcase your experience with cloud-native data platforms and migrations to modern platforms like Microsoft Fabric, highlighting your ability to adapt to new technologies and environments.

📌 Application Steps

To apply for this Data Platform Engineer position:

  1. Submit your application: Through the application link provided on the job listing.
  2. Tailor your resume: Highlight your relevant data engineering experience, skills, and projects, emphasizing your problem-solving skills and collaboration abilities.
  3. Prepare your portfolio: Ensure your portfolio showcases your experience in designing, developing, and maintaining data pipelines using relevant technologies, with a focus on data governance, security, and compliance aspects.
  4. Research the company: Familiarize yourself with Marshalls PLC's company culture, values, and commitment to digital transformation. Be prepared to discuss how your skills and experience align with the company's goals and objectives.
  5. Prepare for technical interviews: Brush up on your data engineering skills, problem-solving abilities, and collaboration strategies. Be ready to discuss your approach to data governance, security, and compliance aspects, as well as your experience working in an Agile environment.

⚠️ Important Notice: This enhanced job description includes AI-generated insights and data engineering industry-standard assumptions. All details should be verified directly with the hiring organization before making application decisions.

Application Requirements

Candidates should have proven experience in data engineering roles with advanced SQL and Python skills. Familiarity with cloud-native platforms and data governance tools is essential, along with strong problem-solving and collaboration abilities.