Data Engineer - Databricks / AI in Glasgow

Glasgow · Full-Time · £36,000 - £60,000 / year (est.) · No home office possible

At a Glance

  • Tasks: Design and build robust data platforms using Azure and Databricks.
  • Company: Join Jacobs, a global leader in innovative solutions for critical infrastructure.
  • Benefits: Flexible working arrangements, well-being benefits, and opportunities for personal growth.
  • Why this job: Make a real impact by solving high-scale problems with cutting-edge technology.
  • Qualifications: Experience with SQL, Python, and cloud data platforms like Azure.
  • Other info: Inclusive culture that values collaboration and diverse perspectives.

The predicted salary is between £36,000 and £60,000 per year.

At Jacobs, we are challenging today to reinvent tomorrow by solving the world’s most critical problems for thriving cities, resilient environments, mission-critical outcomes, operational advancement, scientific discovery and cutting-edge manufacturing.

Your impact: Enjoy designing elegant data systems, shipping production-grade code, and seeing your work make a measurable difference. Our team builds data platforms and AI solutions that power critical infrastructure, transform operations, and move entire industries.

In this role you will work with a broad set of clients on high-scale problems, with the backing of a global organisation that is investing heavily in Azure, Databricks, and applied AI. You will primarily work on Azure + Databricks (Spark, Delta Lake, Unity Catalog) to ship modern ELT/ETL, streaming and batch data products, and ML/AI pipelines, operating at serious scale across water, transport, energy, and more. Join a collaborative, engineering-led culture with real investment in platforms & tooling.

You'll be utilising a stack such as:

  • Azure: ADLS Gen2, Event Hubs, Azure Data Factory (ADF) or Synapse pipelines, Functions, Key Vault, VNets
  • Databricks: Spark, Delta Lake, Unity Catalog, Workflows, MLflow (experiments, model registry)
  • Languages: Python (PySpark), SQL (Delta SQL), optionally Scala
  • Engineering: Git, pull requests, code review, unit/integration tests, dbx, notebooks as code
  • Platform & Ops: Azure DevOps/GitHub, CI/CD, Terraform or Bicep, monitoring/alerting
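To make the Delta Lake piece of this stack concrete: its workhorse write pattern is the idempotent upsert (`MERGE INTO`). Below is a minimal sketch of that merge semantics in plain Python (the table shape and key are invented for illustration; a dict keyed by primary key stands in for the Delta table):

```python
# Sketch of Delta-style MERGE semantics (upsert by key) in plain Python.
# On Databricks this would be `MERGE INTO silver.readings USING updates ON ...`.

def merge_upsert(table: dict, updates: list, key: str) -> dict:
    """Apply MERGE semantics: update rows whose key matches, insert the rest."""
    merged = dict(table)  # copy: Delta writes a new table version, not in place
    for row in updates:
        merged[row[key]] = row  # matched -> update; not matched -> insert
    return merged

# Example: one update (id=1) and one insert (id=3)
table = {1: {"id": 1, "v": 10}, 2: {"id": 2, "v": 20}}
updates = [{"id": 1, "v": 11}, {"id": 3, "v": 30}]
result = merge_upsert(table, updates, key="id")
```

Because Delta versions the table rather than mutating it in place, the original snapshot stays queryable (time travel); the dict copy above mirrors that behaviour.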

Your remit and responsibilities will include:

  • Design & build robust data platforms and pipelines on Azure and Databricks (batch + streaming) using Python/SQL, Spark, Delta Lake, and Data Lakehouse patterns
  • Develop AI-enabling foundations: feature stores, ML-ready datasets, and automated model-serving pathways (MLflow, model registries, CI/CD)
  • Own quality & reliability: testing (dbx/pytest), observability (metrics, logging, lineage), and cost/performance optimisation
  • Harden for the enterprise: security-by-design, access patterns with Unity Catalog, data governance, and reproducible environments
  • Automate the boring stuff: IaC (Terraform/Bicep), CI/CD (Azure DevOps/GitHub Actions), and templated project scaffolding
  • Partner with clients to translate business problems into technical plans, run workshops, and present trade-offs with clarity
  • Ship value continuously; iterate, review, and release frequently; measure outcomes, not just outputs
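On the quality & reliability bullet: a common pattern is keeping transformation rules as plain, pytest-friendly functions, separate from the Spark wiring, so they can be tested without a cluster. A hedged sketch (the sensor schema and quality rules here are invented for illustration, not taken from the posting):

```python
from typing import Optional

def clean_reading(row: dict) -> Optional[dict]:
    """Normalise a raw sensor reading; drop rows that fail basic quality checks.
    On Databricks the same rule would be applied via a PySpark expression or UDF."""
    try:
        value = float(row["value"])
    except (KeyError, TypeError, ValueError):
        return None  # quarantine malformed rows
    if value < 0:
        return None  # hypothetical rule: negative readings are sensor errors
    return {"sensor_id": str(row["sensor_id"]).strip().lower(), "value": value}

def test_clean_reading():
    assert clean_reading({"sensor_id": " S1 ", "value": "4.2"}) == {"sensor_id": "s1", "value": 4.2}
    assert clean_reading({"sensor_id": "S2", "value": "oops"}) is None
    assert clean_reading({"sensor_id": "S3", "value": -1}) is None
```

Tests like these run in CI on every pull request, while integration tests against a real workspace run less frequently.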

Here’s what you’ll need. Our team would be delighted to hear from candidates with a good mix of:

  • Strong SQL and Python skills for building reliable data pipelines
  • Hands-on with Spark (preferably Databricks) and modern data modelling (e.g., Kimball/Inmon/Data Vault, lakehouse)
  • Experience running on a cloud data platform (ideally Azure)
  • Sound software delivery practices: Git, CI/CD, testing, Agile ways of working
  • Streaming/event-driven designs (Event Hubs, Kafka, Structured Streaming)
  • MPP/Data Warehouses (Synapse, Snowflake, Redshift) and NoSQL (Cosmos DB)
  • ML enablement: feature engineering at scale, MLflow, basic model lifecycle know-how
  • Infrastructure-as-code (Terraform/Bicep) and platform hardening
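For the streaming/event-driven bullet, the core idea behind Structured Streaming aggregations is bucketing events into event-time windows. A minimal tumbling-window sketch in plain Python (the event shape and 60-second window are illustrative only):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Count events per (key, tumbling window) bucket, keyed on event time.
    Structured Streaming's `groupBy(window("ts", "1 minute"), "key").count()`
    computes the same aggregation, but incrementally over an unbounded stream."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_secs) * window_secs  # bucket by event time
        counts[(key, window_start)] += 1
    return dict(counts)

# Example: events as (epoch_seconds, key) pairs
events = [(5, "a"), (61, "a"), (62, "b"), (119, "a")]
counts = tumbling_window_counts(events, window_secs=60)
# -> {("a", 0): 1, ("a", 60): 2, ("b", 60): 1}
```

The real streaming engine adds what this sketch omits: incremental state, late-data handling via watermarks, and exactly-once sinks.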

Don’t meet every single bullet? We’d still love to hear from you. We hire for mindset and potential as much as current skills!

Joining Jacobs connects you locally but globally. Our values stand on a foundation of safety, integrity, inclusion and belonging. We put people at the heart of our business, and we truly believe that by supporting one another through our culture of caring, we all succeed.

With safety and flexibility always top of mind, we’ve gone beyond traditional ways of working so you have the support, means and space to maximize your potential. You’ll uncover flexible working arrangements, benefits, and opportunities, from well-being benefits to our global giving and volunteering program.

We aim to embed inclusion and belonging in everything we do. We know that if we are inclusive, we’re more connected and creative. We accept people for who they are, and we champion the richness of different perspectives, lived experiences and backgrounds in the workplace, as a source of learning and innovation.

Jacobs partners with VERCIDA to help us attract and retain talent from a wide range of backgrounds. As a disability confident employer, we will interview disabled candidates who best meet the criteria. We welcome applications from candidates who are seeking flexible working and from those who may not meet all the listed requirements for a role.

We value collaboration and believe that in-person interactions are crucial for both our culture and client delivery. We empower employees with our hybrid working policy, allowing them to split their work week between Jacobs offices/projects and remote locations, enabling them to deliver their best work.

Your application experience is important to us, and we’re keen to adapt to make every interaction even better. If you require further support or reasonable adjustments with regards to the recruitment process, please contact the team via Careers Support.

Data Engineer - Databricks / AI in Glasgow employer: Jacobs

At Jacobs, we pride ourselves on being an exceptional employer, offering a collaborative and engineering-led culture that prioritises safety, integrity, and inclusion. Our commitment to employee growth is evident through flexible working arrangements, comprehensive well-being benefits, and opportunities for professional development, all while working on impactful projects that transform communities globally. Join us in a dynamic environment where your contributions make a measurable difference, and discover how you can thrive within our supportive and innovative team.

Contact Detail:

Jacobs Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land Data Engineer - Databricks / AI in Glasgow

✨Tip Number 1

Network like a pro! Reach out to folks in the industry, attend meetups, and connect with people on LinkedIn. You never know who might have the inside scoop on job openings or can refer you directly.

✨Tip Number 2

Show off your skills! Create a portfolio showcasing your data projects, especially those using Azure and Databricks. This gives potential employers a taste of what you can do and sets you apart from the crowd.

✨Tip Number 3

Prepare for interviews by brushing up on common data engineering questions and practical scenarios. Practice coding challenges and be ready to discuss your past projects in detail—this is your chance to shine!

✨Tip Number 4

Don’t forget to apply through our website! It’s the best way to ensure your application gets seen. Plus, we love seeing candidates who are proactive about their job search.

We think you need these skills to ace Data Engineer - Databricks / AI in Glasgow

Python
SQL
Databricks
Spark
Delta Lake
Azure Data Factory
Event Hubs
MLflow
Infrastructure as Code (Terraform/Bicep)
CI/CD
Data Modelling (Kimball/Inmon/Data Vault)
Streaming/Event-Driven Design
Data Governance
Observability
Agile Methodologies

Some tips for your application 🫡

Tailor Your Application: Make sure to customise your CV and cover letter for the Data Engineer role. Highlight your experience with Azure, Databricks, and Python, as these are key to what we’re looking for. Show us how your skills align with our mission to solve critical problems!

Showcase Your Projects: Don’t just list your skills; give us examples! Share specific projects where you’ve built data pipelines or worked with ML/AI solutions. We love seeing how you’ve made a measurable difference in your previous roles.

Be Clear and Concise: When writing your application, keep it straightforward. Use clear language and avoid jargon unless it’s relevant. We appreciate a well-structured application that gets straight to the point—just like we do in our engineering work!

Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way to ensure your application gets into the right hands. Plus, you’ll find all the details about the role and our company culture there!

How to prepare for a job interview at Jacobs

✨Know Your Tech Stack

Make sure you’re well-versed in the technologies mentioned in the job description, especially Azure, Databricks, and Python. Brush up on your knowledge of Spark, Delta Lake, and data modelling techniques like Kimball or Data Vault. Being able to discuss these confidently will show that you’re ready to hit the ground running.

✨Prepare for Scenario-Based Questions

Expect questions that ask you to solve real-world problems using your technical skills. Think about how you would design a data pipeline or troubleshoot an issue with a data platform. Practising these scenarios can help you articulate your thought process clearly during the interview.

✨Showcase Your Collaboration Skills

Since the role involves working with clients and teams, be prepared to discuss your experience in collaborative environments. Share examples of how you’ve worked with others to translate business needs into technical solutions. This will highlight your ability to communicate effectively and work as part of a team.

✨Ask Insightful Questions

At the end of the interview, don’t forget to ask questions that demonstrate your interest in the company and the role. Inquire about their current projects, the team culture, or how they measure success in this position. This shows that you’re genuinely interested and engaged in the opportunity.
