Senior Data Engineer in Glasgow

Glasgow | Full-Time | £48,000 - £72,000 / year (est.) | No home office possible

At a Glance

  • Tasks: Engineer scalable ELT pipelines and build a modern data platform from scratch.
  • Company: Join a leading UK consumer brand on an exciting data transformation journey.
  • Benefits: Competitive salary, flexible working, and opportunities for professional growth.
  • Why this job: Be at the forefront of data innovation and influence architecture decisions.
  • Qualifications: Experience in Data Engineering with Azure and Databricks is essential.
  • Other info: Dynamic team environment with a focus on collaboration and cutting-edge technology.

The predicted salary is between £48,000 and £72,000 per year.

Our incredibly successful client, a consumer brand, is undertaking a major data modernisation programme, moving away from legacy systems, manual Excel reporting, and fragmented data sources into a fully automated Azure Enterprise Landing Zone + Databricks Lakehouse. They are building a modern data platform from the ground up using Lakeflow Declarative Pipelines, Unity Catalog, and Azure Data Factory, and this role sits right at the heart of that transformation. This is a rare opportunity to join early, influence architecture, and help define engineering standards, pipelines, curated layers, and best practices that will support Operations, Finance, Sales, Logistics, and Customer Care.

If you want to build a best-in-class Lakehouse from scratch, this is the one.

What You'll Be Doing
  • Lakehouse Engineering (Azure + Databricks)
    • Engineer scalable ELT pipelines using Lakeflow Declarative Pipelines, PySpark, and Spark SQL across a full Medallion Architecture (Bronze - Silver - Gold); a minimal sketch of this flow appears after this list.
    • Implement ingestion patterns for files, APIs, SaaS platforms (e.g. subscription billing), SQL sources, SharePoint, and SFTP using ADF + metadata-driven frameworks.
    • Apply Lakeflow expectations for data quality, schema validation, and operational reliability.
  • Curated Data Layers & Modelling
    • Build clean, conformed Silver/Gold models aligned to enterprise business domains (customers, subscriptions, deliveries, finance, credit, logistics, operations).
    • Deliver star schemas, harmonisation logic, SCDs, and business marts to power high-performance Power BI datasets (an illustrative SCD merge sketch follows this list).
    • Apply governance, lineage, and fine-grained permissions via Unity Catalog.
  • Orchestration & Observability
    • Design and optimise orchestration using Lakeflow Workflows and Azure Data Factory.
    • Implement monitoring, alerting, SLAs/SLIs, runbooks, and cost-optimisation across the platform.
  • DevOps & Platform Engineering
    • Build CI/CD pipelines in Azure DevOps for notebooks, Lakeflow pipelines, SQL models, and ADF artefacts.
    • Ensure secure, enterprise-grade platform operation across Dev and Prod environments, using private endpoints, managed identities, and Key Vault.
    • Contribute to platform standards, design patterns, code reviews, and future roadmap.
  • Collaboration & Delivery
    • Work closely with BI/Analytics teams to deliver curated datasets powering dashboards across the organisation.
    • Influence architecture decisions and uplift engineering maturity within a growing data function.
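
To give a flavour of the Lakehouse engineering described above, here is a minimal sketch of a Lakeflow Declarative Pipelines (formerly Delta Live Tables) bronze-to-silver flow with Auto Loader ingestion and data-quality expectations. The orders feed, storage path, and column names are illustrative assumptions, not the client's actual pipeline.

```python
# Minimal sketch only - runs inside a Databricks Lakeflow Declarative Pipeline
# (formerly Delta Live Tables), where `spark` and the `dlt` module are provided
# by the runtime. Feed name, path, and columns are hypothetical.
import dlt
from pyspark.sql import functions as F

RAW_PATH = "abfss://landing@yourstorageaccount.dfs.core.windows.net/orders/"  # placeholder path

@dlt.table(comment="Bronze: raw order files ingested incrementally with Auto Loader.")
def orders_bronze():
    return (
        spark.readStream.format("cloudFiles")        # Auto Loader incremental file ingestion
        .option("cloudFiles.format", "json")
        .load(RAW_PATH)
    )

@dlt.table(comment="Silver: typed orders that pass basic data-quality expectations.")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")
@dlt.expect_or_drop("non_negative_amount", "amount >= 0")
def orders_silver():
    return (
        dlt.read_stream("orders_bronze")
        .withColumn("order_ts", F.to_timestamp("order_ts"))        # enforce a proper timestamp type
        .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    )
```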
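
The curated-layer work (star schemas, SCDs, merge strategies) often ends up as Delta Lake MERGE logic along these lines. A hypothetical Gold customer dimension and change feed are assumed; a real SCD Type 2 job would add a second pass (or the staged-union pattern) to insert the new current row for changed keys.

```python
# Minimal SCD Type 2 sketch against a hypothetical Gold dimension - table and
# column names are assumptions, not the client's model.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

changes = spark.table("silver.customer_changes")          # latest change set (assumed)
dim = DeltaTable.forName(spark, "gold.dim_customer")      # existing dimension (assumed)

(
    dim.alias("d")
    .merge(changes.alias("u"), "d.customer_id = u.customer_id AND d.is_current = true")
    # Close out the current row when a tracked attribute has changed.
    .whenMatchedUpdate(
        condition="d.address <> u.address",
        set={"is_current": "false", "valid_to": "u.effective_ts"},
    )
    # Insert brand-new customers as the current row.
    .whenNotMatchedInsert(
        values={
            "customer_id": "u.customer_id",
            "address": "u.address",
            "valid_from": "u.effective_ts",
            "valid_to": "null",
            "is_current": "true",
        }
    )
    .execute()
)
# A second pass (not shown) would insert the new current row for customers whose
# old row was just closed.
```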

Tech Stack You'll Work With
  • Databricks: Lakeflow Declarative Pipelines, Workflows, Unity Catalog, SQL Warehouses
  • Azure: ADLS Gen2, Data Factory, Key Vault, vNets & Private Endpoints
  • Languages: PySpark, Spark SQL, Python, Git
  • DevOps: Azure DevOps Repos, Pipelines, CI/CD
  • Analytics: Power BI, Fabric

What We're Looking For
  • Experience
    • Significant commercial Data Engineering experience, with years spent delivering production workloads on Azure + Databricks.
    • Strong PySpark/Spark SQL and distributed data processing expertise.
    • Proven Medallion/Lakehouse delivery experience using Delta Lake.
    • Solid dimensional modelling (Kimball) including surrogate keys, SCD types 1/2, and merge strategies.
    • Operational experience - SLAs, observability, idempotent pipelines, reprocessing, backfills (an illustrative backfill sketch follows this list).
  • Mindset
    • Strong grounding in secure Azure Landing Zone patterns.
    • Comfort with Git, CI/CD, automated deployments, and modern engineering standards.
    • Clear communicator who can translate technical decisions into business outcomes.
  • Nice to Have
    • Databricks Certified Data Engineer Associate
    • Streaming ingestion experience (Auto Loader, structured streaming, watermarking)
    • Subscription/entitlement modelling experience
    • Advanced Unity Catalog security (RLS, ABAC, PII governance)
    • Terraform/Bicep for IaC
    • Fabric Semantic Model / Direct Lake optimisation
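
On the operational side, one common way to make reprocessing and backfills idempotent is Delta Lake's replaceWhere, sketched below for a hypothetical daily deliveries mart; table, column, and date values are assumptions.

```python
# Minimal idempotent backfill sketch - re-running the same window overwrites it
# rather than appending duplicates. Names and dates are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

start, end = "2024-01-01", "2024-01-07"   # window being reprocessed (example values)

recomputed = (
    spark.table("silver.deliveries")
    .where(F.col("delivery_date").between(start, end))
    .groupBy("delivery_date", "region")
    .agg(F.count("*").alias("deliveries"), F.sum("weight_kg").alias("total_weight_kg"))
)

(
    recomputed.write.format("delta")
    .mode("overwrite")
    # Only rows in the backfill window are replaced; the rest of the table is untouched.
    .option("replaceWhere", f"delivery_date >= '{start}' AND delivery_date <= '{end}'")
    .saveAsTable("gold.deliveries_daily")
)
```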

Senior Data Engineer in Glasgow employer: Head Resourcing

Join a leading UK consumer business in Glasgow as a Senior Data Engineer and be part of an exciting data modernisation journey. With a strong focus on employee growth, this company fosters a collaborative work culture that encourages innovation and the sharing of ideas, while offering competitive benefits and opportunities to influence architecture and engineering standards from the ground up. Experience the unique advantage of working in a vibrant city with a commitment to building a best-in-class data platform.

Contact Details:

Head Resourcing Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land the Senior Data Engineer role in Glasgow

✨Tip Number 1

Network like a pro! Reach out to folks in your industry on LinkedIn or at local meetups. You never know who might have the inside scoop on job openings or can put in a good word for you.

✨Tip Number 2

Show off your skills! Create a portfolio or GitHub repository showcasing your projects, especially those related to Azure and Databricks. This gives potential employers a taste of what you can do beyond your CV.

✨Tip Number 3

Prepare for interviews by practising common data engineering questions and scenarios. Think about how you’d tackle real-world problems using Lakeflow and Azure. Confidence is key!

✨Tip Number 4

Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, we love seeing candidates who are proactive about their job search.

We think you need these skills to ace the Senior Data Engineer role in Glasgow

Azure Data Factory
Databricks Lakehouse
Lakeflow Declarative Pipelines
PySpark
Spark SQL
Medallion Architecture
Data Modelling
Dimensional Modelling (Kimball)
CI/CD Pipelines
Git
Power BI
Observability
Data Governance
Azure DevOps
Streaming Ingestion

Some tips for your application 🫡

Tailor Your CV: Make sure your CV is tailored to the Senior Data Engineer role. Highlight your experience with Azure, Databricks, and any relevant projects that showcase your skills in building scalable ELT pipelines and data modelling.

Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data engineering and how your background aligns with our mission at StudySmarter. Don’t forget to mention specific technologies you’ve worked with that relate to the job description.

Showcase Your Projects: If you've worked on any notable projects, especially those involving Lakehouse architecture or Azure Data Factory, make sure to include them. We love seeing real-world applications of your skills, so don’t hold back!

Apply Through Our Website: We encourage you to apply through our website for a smoother application process. It helps us keep track of your application and ensures you’re considered for this exciting opportunity with us at StudySmarter!

How to prepare for a job interview at Head Resourcing

✨Know Your Tech Stack Inside Out

Make sure you’re well-versed in Azure, Databricks, and the specific tools mentioned in the job description. Brush up on your PySpark and Spark SQL skills, and be ready to discuss how you've used them in past projects. This will show that you’re not just familiar with the tech but can also apply it effectively.

✨Prepare for Scenario-Based Questions

Expect questions that ask you to solve real-world problems related to data engineering. Think about how you would design scalable ELT pipelines or implement ingestion patterns. Practising these scenarios will help you articulate your thought process clearly during the interview.

✨Showcase Your Collaboration Skills

This role involves working closely with BI/Analytics teams, so be prepared to discuss how you’ve collaborated in the past. Share examples of how you’ve influenced architecture decisions or improved engineering standards in previous roles. This will highlight your ability to work within a team and drive results.

✨Ask Insightful Questions

At the end of the interview, don’t forget to ask questions that demonstrate your interest in the company’s data transformation journey. Inquire about their current challenges with legacy systems or how they envision the future of their data platform. This shows that you’re genuinely interested in contributing to their success.
