Data Platform Engineer

Full-Time · £48,000 - £72,000 / year (est.) · Hybrid (3 days in the office, 2 days WFH)

At a Glance

  • Tasks: Design and maintain cloud-based data infrastructure using Azure and Databricks.
  • Company: Join a leading financial services client focused on innovation.
  • Benefits: Hybrid work model, competitive salary, and opportunities for professional growth.
  • Why this job: Be at the forefront of data technology and make a real impact.
  • Qualifications: 5+ years in Azure services and strong skills in Databricks and Python.
  • Other info: Collaborative environment with excellent career advancement opportunities.

The predicted salary is between £48,000 and £72,000 per year.

Permanent · Hybrid (3 days in the office, 2 days WFH) · London

McCabe & Barton are partnering with a leading financial services client to recruit an experienced Data Platform Engineer. This is an excellent opportunity to join a forward-thinking team driving innovation with modern cloud-based data technologies.

Role Overview

As a Data Platform Engineer, you will design, build, and maintain scalable cloud-based data infrastructure using Azure and Databricks. You will play a key role in ensuring that data pipelines, architecture, and analytics environments are reliable, performant, and secure.

Key Responsibilities

  • Platform Development & Maintenance
    • Design and implement data pipelines using Azure Data Factory, Databricks, and related Azure services.
    • Build ETL/ELT processes to transform raw data into structured, analytics-ready formats.
    • Optimise pipeline performance and ensure high availability of data services.
  • Infrastructure & Architecture
    • Architect and deploy scalable data lake solutions using Azure Data Lake Storage.
    • Implement governance and security measures across the platform.
    • Leverage Terraform or similar IaC tools for controlled and reproducible deployments.
  • Databricks Development
    • Develop and optimise data jobs using PySpark or Scala within Databricks.
    • Implement the medallion architecture (bronze, silver, gold layers) and use Delta Lake for reliable data transactions.
    • Manage cluster configurations and CI/CD pipelines for Databricks deployments.
  • Monitoring & Operations
    • Implement monitoring solutions using Azure Monitor, Log Analytics, and Databricks tools.
    • Optimise performance, ensure SLAs are met, and establish disaster recovery and backup strategies.
  • Collaboration & Documentation
    • Partner with data scientists, analysts, and business stakeholders to deliver effective solutions.
    • Document technical designs, data flows, and operational procedures for knowledge sharing.
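To make the medallion responsibility above concrete, here is a minimal, library-free Python sketch of the bronze/silver/gold flow. On an actual Databricks platform these layers would be PySpark DataFrames backed by Delta Lake tables; plain dicts stand in here purely to illustrate the layering, and the field names (`customer`, `amount`) are invented for the example.

```python
# Library-free sketch of the medallion (bronze/silver/gold) pattern.
# In Databricks this would use PySpark DataFrames and Delta Lake tables.

def to_bronze(raw_records):
    """Bronze: land raw records as-is, adding only ingestion metadata."""
    return [{**r, "_ingested": True} for r in raw_records]

def to_silver(bronze):
    """Silver: clean and conform - drop malformed rows, normalise types."""
    silver = []
    for r in bronze:
        if r.get("amount") is None:  # reject malformed rows
            continue
        silver.append({
            "customer": str(r["customer"]).strip().lower(),
            "amount": float(r["amount"]),
        })
    return silver

def to_gold(silver):
    """Gold: aggregate into an analytics-ready view (total per customer)."""
    totals = {}
    for r in silver:
        totals[r["customer"]] = totals.get(r["customer"], 0.0) + r["amount"]
    return totals

raw = [
    {"customer": " Alice ", "amount": "10.5"},
    {"customer": "bob", "amount": 4},
    {"customer": "alice", "amount": None},  # malformed; dropped at silver
    {"customer": "Alice", "amount": 2},
]
gold = to_gold(to_silver(to_bronze(raw)))
print(gold)  # {'alice': 12.5, 'bob': 4.0}
```

The point of the layering is that each stage is independently re-runnable: bronze preserves the raw input, silver applies cleaning rules that can evolve, and gold serves analytics without re-touching raw data.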

Essential Skills & Experience

  • 5+ years of experience with Azure services (Azure Data Factory, ADLS, Azure SQL Database, Synapse Analytics).
  • Strong hands-on expertise in Databricks, Delta Lake, and cluster management.
  • Proficiency in SQL and Python for pipeline development.
  • Familiarity with Git/GitHub and CI/CD practices.
  • Understanding of data modelling, data governance, and security principles.

Desirable Skills

  • Experience with Terraform or other Infrastructure-as-Code tools.
  • Familiarity with Azure DevOps or similar CI/CD platforms.
  • Experience with data quality frameworks and testing.
  • Azure Data Engineer or Databricks certifications.
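As a rough illustration of the data-quality testing mentioned above, the sketch below declares simple expectations and runs them over rows, collecting failures. Real frameworks (for example, Great Expectations) follow a similar declare-then-validate shape but are far richer; the function names and fields here are hypothetical.

```python
# Hedged sketch of declarative data-quality checks:
# declare expectations, run them over rows, collect failures.

def expect_not_null(field):
    """Expectation: the field must be present and non-null."""
    return (f"{field} is not null", lambda row: row.get(field) is not None)

def expect_in_range(field, lo, hi):
    """Expectation: the field must fall within [lo, hi]."""
    return (f"{lo} <= {field} <= {hi}",
            lambda row: row.get(field) is not None and lo <= row[field] <= hi)

def validate(rows, expectations):
    """Return a list of (row_index, failed_expectation) pairs."""
    failures = []
    for i, row in enumerate(rows):
        for name, check in expectations:
            if not check(row):
                failures.append((i, name))
    return failures

rows = [
    {"id": 1, "amount": 50},
    {"id": 2, "amount": -5},     # out of range
    {"id": None, "amount": 10},  # null id
]
checks = [expect_not_null("id"), expect_in_range("amount", 0, 1000)]
print(validate(rows, checks))
# [(1, '0 <= amount <= 1000'), (2, 'id is not null')]
```

Wiring checks like these into a CI/CD pipeline is what turns data quality from an ad-hoc activity into a testable, repeatable one.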

Please apply with an updated CV if you align with the key skills required!

Data Platform Engineer employer: McCabe & Barton

At McCabe & Barton, we pride ourselves on being an exceptional employer, offering a dynamic work culture that fosters innovation and collaboration. As a Data Platform Engineer in London, you'll benefit from a hybrid working model, competitive remuneration, and ample opportunities for professional growth within a leading financial services environment that embraces modern cloud technologies.

Contact Detail:

McCabe & Barton Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land the Data Platform Engineer role

✨Tip Number 1

Network like a pro! Reach out to your connections in the industry, attend meetups, and engage with professionals on LinkedIn. We all know that sometimes it’s not just what you know, but who you know that can help you land that Data Platform Engineer role.

✨Tip Number 2

Show off your skills! Create a portfolio showcasing your projects, especially those involving Azure, Databricks, and data pipelines. We recommend sharing this on platforms like GitHub or even your own website to give potential employers a taste of what you can do.

✨Tip Number 3

Prepare for interviews by brushing up on your technical knowledge and soft skills. We suggest practising common interview questions related to data engineering and being ready to discuss your past experiences with Azure services and Databricks.

✨Tip Number 4

Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, we’re always looking for talented individuals like you to join our forward-thinking team!

We think you need these skills to ace the Data Platform Engineer role

Azure Data Factory
Databricks
ETL/ELT processes
Azure Data Lake Storage
Terraform
PySpark
Scala
Delta Lake
SQL
Python
Git/GitHub
CI/CD practices
Data modelling
Data governance
Security principles

Some tips for your application

Tailor Your CV: Make sure your CV highlights your experience with Azure services and Databricks. We want to see how your skills align with the role, so don’t be shy about showcasing relevant projects or achievements!

Showcase Your Technical Skills: When writing your application, emphasise your hands-on expertise in SQL, Python, and any Infrastructure-as-Code tools like Terraform. We love seeing candidates who can demonstrate their technical prowess clearly.

Be Clear and Concise: Keep your application straightforward and to the point. Use bullet points for key responsibilities and achievements to make it easy for us to see your qualifications at a glance.

Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for this exciting opportunity!

How to prepare for a job interview at McCabe & Barton

✨Know Your Tech Stack

Make sure you’re well-versed in Azure services, Databricks, and the tools mentioned in the job description. Brush up on your knowledge of data pipelines, ETL processes, and the medallion architecture. Being able to discuss these topics confidently will show that you’re ready to hit the ground running.

✨Showcase Your Problem-Solving Skills

Prepare to discuss specific challenges you've faced in previous roles and how you overcame them. Think about times when you optimised pipeline performance or implemented security measures. Real-world examples will demonstrate your hands-on experience and critical thinking abilities.

✨Collaboration is Key

Since the role involves partnering with data scientists and business stakeholders, be ready to talk about your teamwork experiences. Highlight any projects where you collaborated across teams, and how you communicated technical concepts to non-technical colleagues. This will show that you can bridge the gap between tech and business.

✨Ask Insightful Questions

Prepare a few thoughtful questions about the company’s data strategy, team dynamics, or future projects. This not only shows your interest in the role but also gives you a chance to assess if the company culture aligns with your values. Plus, it keeps the conversation engaging!
