Senior Data Engineer

Full-Time · £36,000–£60,000 / year (est.) · Home office (partial)
Markel

At a Glance

  • Tasks: Design and develop innovative data solutions using Azure technologies.
  • Company: Join Markel, a global leader in insurance and reinsurance.
  • Benefits: Competitive salary, bonuses, 25 days holiday, and strong benefits package.
  • Why this job: Make a meaningful impact while working with cutting-edge data technologies.
  • Qualifications: 3+ years of experience in Azure Databricks and data engineering.
  • Other info: Flexible working patterns and a commitment to diversity.

The predicted salary is between £36,000 and £60,000 per year.

Are you an experienced Data Engineer looking for your next career move? Help us assess the needs of customers and external business partners, and contribute to the solution design and development that enables Markel to achieve its desired business outcomes.

If you’re looking for a place where you can make a meaningful difference, you’ve found it. The work we do at Markel gives people the confidence to move forward and seize opportunities, and you’ll find your fit amongst our global community of optimists and problem‑solvers. We’re always pushing each other to go further because we believe that when we realise our potential, we can help others reach theirs. Join us and play your part in something special!

The opportunity: Working as part of a small but friendly team, you will be responsible for the design and development of critical initiatives and the implementation of our business solutions. As a Senior Data Engineer, you will apply your design and development skills to help solve company challenges: understanding the current state of system functionality and domain standard processes, assessing the needs of collaborators including our external business partners, contributing to solution designs, and developing the solutions needed to achieve desired business outcomes. You will operate in an agile, collaborative environment that values your insight, encourages you to take on new responsibilities, promotes continuous learning, and rewards innovation.

What you’ll be doing:

  • Assist with the design and development of solutions on Azure data services such as Databricks and ADLS, using PySpark and Spark SQL
  • If necessary, create prototypes to validate proposed ideas and solicit input from stakeholders
  • Apply test‑driven development and continuous integration practices throughout delivery
  • Analysis and Design – Convert high‑level designs into low‑level designs and implement them
  • Collaborate with Team Leads to define/clarify business requirements, estimate development costs, and finalise work plans
  • Create and run unit and integration tests throughout the development lifecycle
  • Benchmark application code proactively to prevent performance and scalability concerns
  • Collaborate with the Quality Assurance Team on issue reporting, resolution, and change management
  • Support and Troubleshooting – Assist the Operations Team with any environmental issues that arise during application deployment in the Development, QA, and Production environments
  • Assist other teams in resolving issues that may develop as a result of applications or the integration of multiple components
  • Create and maintain appropriate technical documentation and other project artefacts
  • Develop, test, and iterate on MVP solutions to operationalise new features and products

Our must-haves:

  • Databricks & Lakehouse Expertise
  • 3+ years of experience delivering cloud solutions using Azure Databricks, Delta tables, Azure Data Factory, ADLS Gen2, and Azure VMs.
  • Deep understanding of Delta Lake, including ACID transactions, schema enforcement, and time travel.
  • Ability to design Medallion (Bronze–Silver–Gold) architectures for scalable analytics.
  • Experience with Unity Catalog for data governance, RBAC, lineage, and secure data access.
  • Strong knowledge of Databricks Workflows, Jobs, Repos, Asset Bundles, and CI/CD integrations.
  • Advanced Apache Spark (PySpark/SQL)
  • Expert in PySpark and Spark SQL performance tuning (partitioning, caching, AQE, skew mitigation, broadcast joins).
  • Skilled in building high‑throughput ETL/ELT pipelines and optimising clusters.
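One of the tuning techniques listed above, skew mitigation via key salting, can be sketched without a Spark cluster. The plain-Python simulation below mirrors the two-stage salted `groupBy` pattern used in PySpark; the dataset and salt count are illustrative only:

```python
import random
from collections import defaultdict

def salted_aggregate(records, num_salts=4, seed=42):
    """Simulate skew mitigation: spread a hot key across `num_salts`
    sub-keys, aggregate per sub-key, then merge the partial results.
    In Spark this is a two-stage groupBy on a salted key column."""
    rng = random.Random(seed)
    # Stage 1: aggregate per (key, salt) — in Spark this spreads one
    # skewed key across many partitions/tasks instead of one.
    partial = defaultdict(float)
    for key, value in records:
        salt = rng.randrange(num_salts)
        partial[(key, salt)] += value
    # Stage 2: merge the salted partials back to the original key.
    final = defaultdict(float)
    for (key, _salt), subtotal in partial.items():
        final[key] += subtotal
    return dict(final)

# A "hot" key dominating the dataset (the classic skew scenario):
data = [("GB", 1.0)] * 1000 + [("FR", 2.0)] * 10
totals = salted_aggregate(data)
```

The final totals are identical to an unsalted aggregation; only the intermediate work is spread more evenly.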

Cloud Platform Skills (Azure‑focused):

  • Azure Data Factory
  • Azure Databricks
  • ADLS Gen2
  • Azure Functions
  • Storage Accounts, Key Vault

Data Engineering Foundations:

  • Data modelling: dimensional modelling, relational design, semantic models.
  • Strong SQL—complex transformations, windowing, analytical queries.
  • Experience with structured, semi‑structured, and streaming data ingestion (Auto Loader, Structured Streaming).
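The windowing skills above can be illustrated with a small, self-contained SQL example. This sketch uses Python's built-in `sqlite3` (window functions require SQLite 3.25+) against a hypothetical premiums table; the same `SUM(...) OVER (PARTITION BY ... ORDER BY ...)` pattern carries over directly to Spark SQL:

```python
import sqlite3

# In-memory database with a tiny premium-transactions table
# (illustrative names only; not any real schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE premiums (policy TEXT, month INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO premiums VALUES (?, ?, ?)",
    [("P1", 1, 100.0), ("P1", 2, 150.0), ("P2", 1, 200.0), ("P1", 3, 50.0)],
)

# Window function: running total of premium per policy, ordered by month.
rows = conn.execute(
    """
    SELECT policy, month, amount,
           SUM(amount) OVER (PARTITION BY policy ORDER BY month) AS running_total
    FROM premiums
    ORDER BY policy, month
    """
).fetchall()
```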

CI/CD & DevOps:

  • Using Azure DevOps, GitHub Actions, or Databricks CLI/Repos for automated deployments.
  • Managing multiple environments (dev/test/prod) with parameterisation and environment config.

Data Quality, Observability & Governance:

  • Building automated data quality frameworks (expectations, anomaly detection).
  • Monitoring pipelines, implementing alerting.
  • Experience ensuring regulatory compliance.
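An "expectations"-style quality check like those described above can be sketched in a few lines of plain Python. The rule names and records here are hypothetical; in practice, frameworks such as Delta Live Tables expectations or Great Expectations express the same idea declaratively:

```python
def run_expectations(rows, expectations):
    """Split dict records into (passed, failures), where each failure
    carries the names of the expectations it violated."""
    passed, failures = [], []
    for row in rows:
        failed = [name for name, check in expectations.items() if not check(row)]
        if failed:
            failures.append((row, failed))
        else:
            passed.append(row)
    return passed, failures

# Hypothetical rules for a premiums feed:
expectations = {
    "premium_positive": lambda r: r["premium"] > 0,
    "policy_id_present": lambda r: bool(r.get("policy_id")),
}

records = [
    {"policy_id": "P1", "premium": 120.0},
    {"policy_id": "", "premium": 80.0},
    {"policy_id": "P3", "premium": -5.0},
]
passed, failures = run_expectations(records, expectations)
```

The failing rows, together with the rules they broke, can then feed alerting or quarantine tables rather than silently propagating downstream.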

Desirable:

  • Experience gained in either an insurance/insurance‑related business or a fast‑paced financial services environment.
  • Experience working with toolset such as Jira, Azure DevOps, Ataccama or Confluence.

Who we are: Markel Group (NYSE: MKL), a Fortune 500 company with over 60 offices in 20+ countries, is a holding company for insurance, reinsurance, specialist advisory, and investment operations around the world. We’re all about people | We win together | We strive for better | We enjoy the everyday | We think further.

What’s in it for you?

  • A great starting salary plus annual bonus and a strong benefits package
  • 25 days paid holiday plus Bank Holidays, with the opportunity to buy/sell extra leave
  • Fantastic company pension scheme, private medical and dental cover, life assurance, travel insurance cover, income protection, season ticket loan as well as other great benefits on offer
  • There are countless opportunities to learn new skills and develop in your career and we can provide the support needed to do just that!

Are you ready to play your part? Choose ‘Apply Now’ to fill out our short application, so that we can find out more about you. Markel celebrates the value of a diverse workforce that brings experience and expertise from a wide variety of backgrounds and life circumstances. Whatever your background, if you feel you meet the requirements of this role then we want to hear from you. We are also happy to consider candidates who are looking for flexible working patterns. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, colour, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. We will ensure that individuals with disabilities are provided with all reasonable accommodations to be able to participate in the job application or interview process and to perform essential job functions if successful.

Senior Data Engineer employer: Markel

At Markel, we pride ourselves on being an exceptional employer that fosters a collaborative and innovative work culture. As a Senior Data Engineer, you'll enjoy a competitive salary, comprehensive benefits including a strong pension scheme and private medical cover, and ample opportunities for professional growth in a supportive environment. Join our global community of problem-solvers and make a meaningful impact while enjoying the flexibility of hybrid working arrangements.

Contact Detail:

Markel Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land the Senior Data Engineer role

✨Tip Number 1

Network like a pro! Reach out to your connections in the industry, attend meetups, and engage on platforms like LinkedIn. You never know who might have the inside scoop on job openings or can refer you directly.

✨Tip Number 2

Prepare for those interviews! Research the company, understand their products, and be ready to discuss how your skills align with their needs. Practise common interview questions and have your own questions ready to show your interest.

✨Tip Number 3

Showcase your projects! Whether it's through a portfolio or GitHub, let your work speak for itself. Highlight any relevant experience with Azure Data offerings and demonstrate your problem-solving skills.

✨Tip Number 4

Apply through our website! It’s the best way to ensure your application gets seen. Plus, it shows you’re genuinely interested in being part of our team at Markel. Don’t miss out on this opportunity!

We think you need these skills to ace the Senior Data Engineer role

Azure Databricks
Delta Tables
Azure Data Factory
ADLS Gen2
Data Lake
Apache Spark (PySpark/SQL)
ETL/ELT pipelines
SQL
Data modelling
CI/CD
DevOps
Data Quality frameworks
Anomaly detection
Monitoring pipelines
Technical documentation

Some tips for your application 🫡

Tailor Your Application: Make sure to customise your CV and cover letter to highlight your experience with Azure Databricks, Delta Tables, and other key skills mentioned in the job description. We want to see how your background aligns with what we're looking for!

Showcase Your Projects: Include specific examples of projects you've worked on that demonstrate your expertise in data engineering. Whether it's building ETL pipelines or optimising performance, we love to see real-world applications of your skills.

Be Clear and Concise: When writing your application, keep it straightforward and to the point. Use bullet points where possible to make it easy for us to read through your qualifications and experiences quickly.

Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it’s super easy!

How to prepare for a job interview at Markel

✨Know Your Tech Inside Out

Make sure you brush up on your knowledge of Azure Databricks, Delta Tables, and the Medallion architecture. Be ready to discuss how you've used these technologies in past projects and how they can be applied to solve challenges at Markel.

✨Showcase Your Problem-Solving Skills

Prepare examples of how you've tackled complex data engineering problems. Think about specific instances where your insights led to innovative solutions or improved processes, especially in an agile environment.

✨Understand the Business Context

Familiarise yourself with Markel's business model and how data engineering plays a role in driving outcomes. This will help you align your answers with their goals and demonstrate that you're not just a techie but also a strategic thinker.

✨Ask Insightful Questions

Prepare thoughtful questions about the team dynamics, ongoing projects, and how success is measured in the role. This shows your genuine interest in the position and helps you gauge if it's the right fit for you.

