At a Glance
- Tasks: Design and build Databricks Lakehouse solutions to transform data into actionable insights.
- Company: Join The AA, a beloved brand with a rich data heritage.
- Benefits: Enjoy 25 days annual leave, free breakdown membership, and diverse learning opportunities.
- Why this job: Make a real impact in the data world with cutting-edge technology.
- Qualifications: Proven experience with Azure Databricks and strong Python skills required.
- Other info: Hybrid working model with a supportive and inclusive company culture.
The predicted salary is between £36,000 and £60,000 per year.
Location: Basingstoke (hybrid working 3 office days per week)
Join Our Data & Analytics Team: Transforming Data into Our Superpower!
Are you passionate about data and eager to make a significant impact? The AA is a well-loved brand with a range of driver services much wider than most people realise. We have an enviable set of data assets from breakdown, service, repair, insurance, telematics, digital interactions, car dealers and driving school!
Employment Type: Permanent, full time
Additional Benefits: Annual Bonus
Role context: At The AA, our purpose is to create confidence for drivers now and for the future. Data plays a critical role in delivering that purpose, and we are investing heavily in a modern, Databricks-centric data platform to unlock the full value of our connected car and insurance data.
This role is for an experienced Databricks Data Engineer. You will be working day-to-day designing, building and operating production-grade Databricks Lakehouse solutions, including structured streaming pipelines, Unity Catalog governance, and Spark-based data engineering at scale.
What will I be doing?
- Designing, building and operating production-grade Databricks Lakehouse solutions, including structured streaming pipelines using Python and PySpark (see the illustrative pipeline sketch after this list)
- Owning and evolving Unity Catalog–based governance, ensuring secure, discoverable and well-managed data assets
- Developing and maintaining event-driven data pipelines, integrating closely with backend engineering teams
- Implementing and supporting CI/CD pipelines in Azure DevOps to enable reliable, automated Databricks deployments
- Creating high-quality, analytics-ready datasets that deliver actionable business insight at scale
- Proactively improving performance, reliability, automation and observability across the Databricks data platform
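To make the day-to-day concrete, here is a minimal sketch of the kind of structured streaming pipeline described in the list above, assuming Auto Loader ingestion of JSON events into a bronze Delta table; every path, catalog, schema, table and column name is a hypothetical placeholder, not one of The AA's actual assets.

```python
# Minimal sketch: incremental ingestion of raw events into a Delta table.
# All paths and names below are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

# Incrementally discover and read raw JSON event files with Auto Loader (cloudFiles)
events = (
    spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/tmp/example/schema")
    .load("/tmp/example/landing")
)

# Light transformation: derive an event date column for downstream analytics
curated = events.withColumn("event_date", F.to_date(F.col("event_timestamp")))

# Write to a Delta table, tracking progress with a checkpoint; availableNow runs
# the stream as an incremental batch and stops once it has caught up
query = (
    curated.writeStream
    .format("delta")
    .option("checkpointLocation", "/tmp/example/checkpoints/events")
    .trigger(availableNow=True)
    .toTable("example_catalog.bronze.events")
)
```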
What do I need?
We are intentionally setting a high bar for Databricks experience. You should be able to demonstrate deep, hands-on capability, not just theoretical knowledge.
Essential Experience:
- Significant, hands-on experience with Azure Databricks in a production environment
- Proven experience building Spark / Databricks pipelines using Python and PySpark
- Strong experience with structured streaming and event-driven architectures
- Practical experience implementing and operating Unity Catalog (an illustrative governance sketch follows this list)
- Solid understanding of Lakehouse design principles, including dimensional and analytical modelling
- Experience building and maintaining CI/CD pipelines, ideally using Azure DevOps
- Confidence working with large-scale data, performance tuning, and troubleshooting complex pipelines
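As a hedged illustration of what "implementing and operating Unity Catalog" can look like in practice, the sketch below registers a managed table and applies least-privilege grants; the catalog, schema, table and group names are hypothetical.

```python
# Illustrative Unity Catalog governance tasks; all names are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Create a governed schema and a managed Delta table under Unity Catalog
spark.sql("CREATE SCHEMA IF NOT EXISTS example_catalog.insurance_silver")
spark.sql("""
    CREATE TABLE IF NOT EXISTS example_catalog.insurance_silver.policies (
        policy_id STRING,
        start_date DATE,
        premium DECIMAL(10, 2)
    )
""")

# Grant least-privilege access so data stays secure but discoverable
spark.sql("GRANT USE CATALOG ON CATALOG example_catalog TO `analysts`")
spark.sql("GRANT USE SCHEMA ON SCHEMA example_catalog.insurance_silver TO `analysts`")
spark.sql("GRANT SELECT ON TABLE example_catalog.insurance_silver.policies TO `analysts`")

# Document the asset so it is self-describing in the catalog
spark.sql(
    "COMMENT ON TABLE example_catalog.insurance_silver.policies "
    "IS 'Curated policy records (illustrative example)'"
)
```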
What we mean by “Databricks experience” is that you have:
- Designed and built Databricks pipelines end-to-end
- Made architectural decisions within Databricks environments
- Worked with Spark internals, optimisation techniques and cluster configuration (an illustrative tuning sketch follows below)
- Operated Databricks solutions in live, business-critical contexts
If your Databricks exposure has been limited to minor contributions, proof-of-concepts, or occasional usage alongside other tools, this role is unlikely to be the right fit.
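For the optimisation and troubleshooting bar described above, the hedged sketch below shows two routine techniques: broadcasting a small dimension to avoid a shuffle join, and compacting a Delta table for faster reads. Table and column names are hypothetical.

```python
# Illustrative Spark/Delta optimisation steps; all table names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

claims = spark.table("example_catalog.silver.claims")    # large fact table
garages = spark.table("example_catalog.silver.garages")  # small dimension table

# Broadcast the small dimension to avoid an expensive shuffle join
joined = claims.join(F.broadcast(garages), "garage_id")

# Inspect the physical plan to confirm a broadcast hash join was chosen
joined.explain(mode="formatted")

# Compact the Delta table and co-locate rows on the join key for faster reads
spark.sql("OPTIMIZE example_catalog.silver.claims ZORDER BY (garage_id)")
```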
Additional Information: We’re always looking to recognise and reward our employees for the work they do. As a valued member of The AA team, you’ll have access to a range of benefits including:
- 25 days annual leave plus bank holidays + holiday buying scheme
- Worksave pension scheme with up to 7% employer contribution
- Free AA breakdown membership from Day 1 plus 50% discount for family and friends
- Discounts on AA products including car and home insurance
- Employee discount scheme that gives you access to a car salary sacrifice scheme plus great discounts on healthcare, shopping, holidays and more
- Company funded life assurance
- Diverse learning and development opportunities to support you to progress in your career
- Dedicated Employee Assistance Programme and a 24/7 remote GP service for you and your family
- Plus, so much more!
We’re an equal opportunities employer and welcome applications from everyone. The AA values diversity and the difference this brings to our culture and our customers. We actively seek people from diverse backgrounds to join us and become part of an inclusive company where you can be yourself, be empowered to be your best and feel like you truly belong. We have five communities to bring together people with shared characteristics and backgrounds and drive positive change.
Data Engineer - Databricks Specialist in Basingstoke
Employer: The AA
Contact Detail:
The AA Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer - Databricks Specialist role in Basingstoke
✨Tip Number 1
Network like a pro! Reach out to current employees at The AA on LinkedIn and ask about their experiences. A friendly chat can give you insider info and might even lead to a referral!
✨Tip Number 2
Show off your skills! Prepare a mini-project or case study that highlights your Databricks expertise. Bring it up during interviews to demonstrate your hands-on experience and problem-solving abilities.
✨Tip Number 3
Be ready for technical challenges! Brush up on your Python and PySpark skills, and practice explaining your thought process when tackling data engineering problems. Confidence is key!
✨Tip Number 4
Apply through our website! It’s the best way to ensure your application gets noticed. Plus, you’ll be one step closer to joining our awesome Data & Analytics team at The AA!
We think you need these skills to ace the Data Engineer - Databricks Specialist role in Basingstoke
Some tips for your application 🫡
Show Off Your Databricks Skills: Make sure to highlight your hands-on experience with Databricks in your application. We want to see how you've designed and built pipelines, so don’t hold back on the details!
Be Specific About Your Experience: When you describe your past roles, focus on specific projects where you’ve used Python and PySpark. We love seeing concrete examples of your work, especially in production environments.
Tailor Your Application: Don’t just send a generic CV! Tailor your application to reflect the skills and experiences mentioned in the job description. This shows us that you’re genuinely interested in the role and understand what we’re looking for.
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for this exciting opportunity with The AA!
How to prepare for a job interview at The AA
✨Know Your Databricks Inside Out
Make sure you can confidently discuss your hands-on experience with Databricks. Be prepared to share specific examples of how you've designed and built pipelines, as well as any architectural decisions you've made. This will show that you’re not just familiar with the tool but have truly mastered it.
✨Showcase Your Problem-Solving Skills
During the interview, be ready to talk about challenges you've faced while working with data pipelines and how you overcame them. Highlight your troubleshooting techniques and any performance tuning you've done. This demonstrates your ability to handle complex situations effectively.
✨Understand the Business Impact
Articulate how your work with data engineering has contributed to business insights or outcomes. Discuss how your analytics-ready datasets have driven decision-making or improved processes. This shows that you understand the bigger picture and the value of your role.
✨Prepare Questions About Their Data Strategy
Have a few thoughtful questions ready about The AA's data strategy and how they envision the role of a Data Engineer evolving. This not only shows your interest in the position but also your proactive approach to understanding their needs and how you can contribute.