At a Glance
- Tasks: Design and build robust data platforms using Azure and Databricks.
- Company: Join Jacobs, a global leader in innovative solutions.
- Benefits: Flexible working, well-being benefits, and opportunities for growth.
- Why this job: Make a real impact by solving critical problems with cutting-edge technology.
- Qualifications: Experience with SQL, Python, and cloud data platforms preferred.
- Other info: Inclusive culture that values collaboration and diverse perspectives.
The predicted salary is between £36,000 and £60,000 per year.
At Jacobs, we are challenging today to reinvent tomorrow by solving the world’s most critical problems for thriving cities, resilient environments, mission-critical outcomes, operational advancement, scientific discovery and cutting-edge manufacturing.
Your impact: Enjoy designing elegant data systems, shipping production-grade code, and seeing your work make a measurable difference. Our team builds data platforms and AI solutions that power critical infrastructure, transform operations, and move entire industries.
In this role you will work with a broad set of clients on high-scale problems, with the backing of a global organisation that is investing heavily in Azure, Databricks, and applied AI. You will work primarily on Azure + Databricks (Spark, Delta Lake, Unity Catalog) to ship modern ELT/ETL, streaming and batch data products, and ML/AI pipelines, operating at serious scale across water, transport, energy, and more. Join a collaborative, engineering-led culture with real investment in platforms & tooling.
You'll be utilising a stack including:
- Azure: ADLS Gen2, Event Hubs, Azure Data Factory (ADF) or Synapse pipelines, Functions, Key Vault, VNets
- Databricks: Spark, Delta Lake, Unity Catalog, Workflows, MLflow (experiments, model registry)
- Languages: Python (PySpark), SQL (Delta SQL), optionally Scala
- Engineering: Git, pull requests, code review, unit/integration tests, dbx, notebooks as code
- Platform & Ops: Azure DevOps/GitHub, CI/CD, Terraform or Bicep, monitoring/alerting
Your remit and responsibilities will include:
- Design & build robust data platforms and pipelines on Azure and Databricks (batch + streaming) using Python/SQL, Spark, Delta Lake, and Data Lakehouse patterns
- Develop AI-enabling foundations: feature stores, ML-ready datasets, and automated model-serving pathways (MLflow, model registries, CI/CD)
- Own quality & reliability: testing (dbx/pytest), observability (metrics, logging, lineage), and cost/performance optimisation
- Harden for the enterprise: security-by-design, access patterns with Unity Catalog, data governance, and reproducible environments
- Automate the boring stuff: IaC (Terraform/Bicep), CI/CD (Azure DevOps/GitHub Actions), and templated project scaffolding
- Partner with clients to translate business problems into technical plans, run workshops, and present trade-offs with clarity
- Ship value continuously; iterate, review, and release frequently; measure outcomes, not just outputs
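To make the quality-and-reliability remit above concrete, here is a minimal, hypothetical sketch of a unit-tested pipeline transform of the kind this role would ship. It is plain Python rather than PySpark for brevity, and every name in it (`clean_readings`, the reading fields) is invented for illustration:

```python
# Hypothetical example: a small, testable data-cleaning transform plus a
# pytest-style unit test. All names are invented for illustration; a real
# pipeline would express the same rules over a Spark DataFrame.

def clean_readings(rows):
    """Drop rows with missing sensor values and normalise units (kPa -> Pa)."""
    cleaned = []
    for row in rows:
        if row.get("value") is None:
            continue  # data-quality rule: discard incomplete readings
        value = row["value"] * 1000 if row.get("unit") == "kPa" else row["value"]
        cleaned.append({"sensor_id": row["sensor_id"], "value_pa": value})
    return cleaned


def test_clean_readings():
    raw = [
        {"sensor_id": "s1", "value": 100.0, "unit": "kPa"},
        {"sensor_id": "s2", "value": None, "unit": "kPa"},   # dropped
        {"sensor_id": "s3", "value": 250.0, "unit": "Pa"},
    ]
    out = clean_readings(raw)
    assert out == [
        {"sensor_id": "s1", "value_pa": 100000.0},
        {"sensor_id": "s3", "value_pa": 250.0},
    ]
```

Keeping transforms as small pure functions like this is what makes the pytest/CI-CD practices listed above cheap to apply.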
Here’s what you’ll need: Our team would be delighted to hear from candidates with a good mix of:
- Experience using SQL and Python to build reliable data pipelines
- Hands-on with Spark (preferably Databricks) and modern data modelling (e.g., Kimball/Inmon/Data Vault, lakehouse)
- Experience running on a cloud data platform (ideally Azure)
- Sound software delivery practices: Git, CI/CD, testing, Agile ways of working
- Streaming/event-driven designs (Event Hubs, Kafka, Structured Streaming)
- MPP/Data Warehouses (Synapse, Snowflake, Redshift) and NoSQL (Cosmos DB)
- ML enablement: feature engineering at scale, MLflow, basic model lifecycle know-how
- Infrastructure-as-code (Terraform/Bicep) and platform hardening
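The first bullet in the list above (SQL and Python for reliable pipelines) can be sketched with nothing but the standard library. This is an illustrative assumption, not the team's actual codebase: `sqlite3` stands in for a cloud warehouse, and the `trips` table and its columns are invented:

```python
import sqlite3

# Illustrative only: sqlite3 stands in for a warehouse (Synapse, Databricks
# SQL, etc.); the trips table and its columns are invented for this sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trips (city TEXT, passengers INTEGER)")
conn.executemany(
    "INSERT INTO trips VALUES (?, ?)",
    [("Manchester", 3), ("Manchester", 5), ("Leeds", 2)],
)

# A typical pipeline step: aggregate in SQL, consume the result in Python.
rows = conn.execute(
    "SELECT city, SUM(passengers) FROM trips GROUP BY city ORDER BY city"
).fetchall()
print(rows)  # [('Leeds', 2), ('Manchester', 8)]
conn.close()
```

The same pattern (push aggregation into the SQL engine, keep orchestration in Python) scales from a laptop sketch like this to Spark SQL over Delta tables.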
Don’t meet every single bullet? We’d still love to hear from you. We hire for mindset and potential as much as current skills!
Joining Jacobs connects you locally and globally. Our values stand on a foundation of safety, integrity, inclusion and belonging. We put people at the heart of our business, and we truly believe that by supporting one another through our culture of caring, we all succeed.
With safety and flexibility always top of mind, we’ve gone beyond traditional ways of working so you have the support, means and space to maximise your potential. You’ll uncover flexible working arrangements, benefits, and opportunities, from well-being benefits to our global giving and volunteering programme.
We aim to embed inclusion and belonging in everything we do. We know that if we are inclusive, we’re more connected and creative. We accept people for who they are, and we champion the richness of different perspectives, lived experiences and backgrounds in the workplace, as a source of learning and innovation.
As a disability confident employer, we will interview disabled candidates who best meet the criteria. We welcome applications from candidates who are seeking flexible working and from those who may not meet all the listed requirements for a role.
We value collaboration and believe that in-person interactions are crucial for both our culture and client delivery. We empower employees with our hybrid working policy, allowing them to split their work week between Jacobs offices/projects and remote locations, enabling them to deliver their best work.
Your application experience is important to us, and we’re keen to adapt to make every interaction even better. If you require further support or reasonable adjustments with regards to the recruitment process, please contact the team via Careers Support.
Data Engineer - Databricks / AI in Manchester. Employer: Jacobs
Contact Detail:
Jacobs Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer - Databricks / AI role in Manchester
✨Tip Number 1
Network like a pro! Reach out to folks in the industry, attend meetups, and connect with people on LinkedIn. You never know who might have the inside scoop on job openings or can refer you directly.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your data projects, especially those using Azure and Databricks. This gives potential employers a taste of what you can do and sets you apart from the crowd.
✨Tip Number 3
Prepare for interviews by brushing up on common data engineering questions and practical scenarios. Practice coding challenges and be ready to discuss your past projects in detail—this is your chance to shine!
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen. Plus, we love seeing candidates who are proactive about their job search.
We think you need these skills to ace the Data Engineer - Databricks / AI role in Manchester
Some tips for your application 🫡
Tailor Your CV: Make sure your CV reflects the skills and experiences that align with the Data Engineer role. Highlight your experience with Azure, Databricks, and any relevant data engineering projects you've worked on.
Craft a Compelling Cover Letter: Use your cover letter to tell us why you're passionate about data engineering and how you can contribute to our mission. Be specific about your experience with Python, SQL, and any AI solutions you've developed.
Showcase Your Projects: If you've worked on any notable projects, especially those involving data pipelines or machine learning, make sure to mention them. We love seeing real-world applications of your skills!
Apply Through Our Website: We encourage you to apply directly through our website for the best chance of getting noticed. It’s the easiest way for us to keep track of your application and ensure it reaches the right people.
How to prepare for a job interview at Jacobs
✨Know Your Tech Stack
Familiarise yourself with the specific technologies mentioned in the job description, like Azure, Databricks, and Python. Be ready to discuss how you've used these tools in past projects, especially in building data pipelines or AI solutions.
✨Showcase Problem-Solving Skills
Prepare examples of how you've tackled complex data challenges. Think about times when you translated business problems into technical solutions, as this aligns with what the company values. Use the STAR method (Situation, Task, Action, Result) to structure your responses.
✨Emphasise Collaboration
Since the role involves working with clients and teams, highlight your experience in collaborative environments. Share stories that demonstrate your ability to work well with others, run workshops, and communicate technical concepts clearly.
✨Ask Insightful Questions
Prepare thoughtful questions about the company's projects, culture, and future goals. This shows your genuine interest in the role and helps you assess if it's the right fit for you. Consider asking about their approach to data governance or how they implement CI/CD practices.